May 14 23:38:06.750313 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed May 14 22:09:34 -00 2025 May 14 23:38:06.750330 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 14 23:38:06.750336 kernel: Disabled fast string operations May 14 23:38:06.750340 kernel: BIOS-provided physical RAM map: May 14 23:38:06.750344 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable May 14 23:38:06.750348 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved May 14 23:38:06.750354 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved May 14 23:38:06.750359 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable May 14 23:38:06.750365 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data May 14 23:38:06.750371 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS May 14 23:38:06.750377 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable May 14 23:38:06.750382 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved May 14 23:38:06.750386 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved May 14 23:38:06.750390 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved May 14 23:38:06.750397 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved May 14 23:38:06.750402 kernel: NX (Execute Disable) protection: active May 14 23:38:06.750406 kernel: APIC: Static calls initialized May 14 23:38:06.750411 kernel: SMBIOS 2.7 present. May 14 23:38:06.750416 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 May 14 23:38:06.750421 kernel: vmware: hypercall mode: 0x00 May 14 23:38:06.750426 kernel: Hypervisor detected: VMware May 14 23:38:06.750431 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz May 14 23:38:06.750436 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz May 14 23:38:06.750441 kernel: vmware: using clock offset of 4648933622 ns May 14 23:38:06.750446 kernel: tsc: Detected 3408.000 MHz processor May 14 23:38:06.750451 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 14 23:38:06.750457 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 14 23:38:06.750462 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 May 14 23:38:06.750467 kernel: total RAM covered: 3072M May 14 23:38:06.750472 kernel: Found optimal setting for mtrr clean up May 14 23:38:06.750478 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G May 14 23:38:06.750486 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs May 14 23:38:06.750494 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 14 23:38:06.750499 kernel: Using GB pages for direct mapping May 14 23:38:06.750511 kernel: ACPI: Early table checksum verification disabled May 14 23:38:06.750516 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) May 14 23:38:06.750521 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) May 14 23:38:06.750526 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) May 14 23:38:06.750531 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) May 14 23:38:06.750536 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 May 14 23:38:06.750544 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 May 14 23:38:06.750550 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) May 14 23:38:06.750555 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) May 14 23:38:06.750560 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) May 14 23:38:06.750565 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) May 14 23:38:06.750570 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) May 14 23:38:06.750577 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) May 14 23:38:06.750582 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] May 14 23:38:06.750587 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] May 14 23:38:06.750593 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] May 14 23:38:06.750598 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] May 14 23:38:06.750603 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] May 14 23:38:06.750608 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] May 14 23:38:06.750613 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] May 14 23:38:06.750618 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] May 14 23:38:06.750624 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] May 14 23:38:06.750631 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] May 14 23:38:06.750639 kernel: system APIC only can use physical flat May 14 23:38:06.750645 kernel: APIC: Switched APIC routing to: physical flat May 14 23:38:06.750650 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 May 14 23:38:06.750655 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 May 14 23:38:06.750660 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 May 14 23:38:06.750665 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 May 14 23:38:06.750670 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 May 14 23:38:06.750675 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 May 14 23:38:06.750682 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 May 14 23:38:06.750687 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 May 14 23:38:06.750692 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 May 14 23:38:06.750697 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 May 14 23:38:06.750701 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 May 14 23:38:06.750707 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 May 14 23:38:06.750711 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 May 14 23:38:06.750717 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 May 14 23:38:06.750722 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 May 14 23:38:06.750727 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 May 14 23:38:06.750733 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 May 14 23:38:06.750738 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 May 14 23:38:06.750745 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 May 14 23:38:06.750752 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 May 14 23:38:06.750757 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 May 14 23:38:06.750762 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 May 14 23:38:06.750767 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 May 14 23:38:06.750772 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 May 14 23:38:06.750777 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 May 14 23:38:06.750782 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 May 14 23:38:06.750788 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 May 14 23:38:06.750793 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 May 14 23:38:06.750798 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 May 14 23:38:06.750803 kernel: SRAT: PXM 0 -> APIC 0x3a -> 
Node 0 May 14 23:38:06.750808 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 May 14 23:38:06.750813 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 May 14 23:38:06.750818 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 May 14 23:38:06.750823 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 May 14 23:38:06.750828 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 May 14 23:38:06.750833 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 May 14 23:38:06.750838 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0 May 14 23:38:06.750844 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 May 14 23:38:06.750849 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 May 14 23:38:06.750853 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 May 14 23:38:06.750859 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 May 14 23:38:06.750864 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 May 14 23:38:06.750869 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 May 14 23:38:06.750874 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 May 14 23:38:06.750879 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 May 14 23:38:06.750884 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 May 14 23:38:06.750891 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 May 14 23:38:06.750901 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 May 14 23:38:06.750909 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 May 14 23:38:06.750916 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 May 14 23:38:06.750924 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 May 14 23:38:06.750930 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 May 14 23:38:06.750935 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 May 14 23:38:06.750940 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 May 14 23:38:06.750945 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 May 14 23:38:06.750950 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 May 14 23:38:06.750955 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 May 14 23:38:06.750961 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 May 14 23:38:06.750966 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 May 14 23:38:06.750975 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 May 14 23:38:06.750981 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 May 14 23:38:06.750987 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 May 14 23:38:06.750992 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 May 14 23:38:06.750997 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0 May 14 23:38:06.751004 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0 May 14 23:38:06.751013 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 May 14 23:38:06.751021 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0 May 14 23:38:06.751026 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 May 14 23:38:06.751031 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 May 14 23:38:06.751037 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 May 14 23:38:06.751042 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 May 14 23:38:06.751047 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 May 14 23:38:06.751053 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 May 14 23:38:06.751058 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 May 14 23:38:06.751063 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 May 14 23:38:06.751068 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 May 14 23:38:06.751075 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 May 14 23:38:06.751081 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 May 14 23:38:06.751086 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 May 14 23:38:06.751091 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 May 14 23:38:06.751096 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 May 14 23:38:06.751102 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 May 14 23:38:06.751107 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 May 14 23:38:06.751113 kernel: SRAT: PXM 0 -> 
APIC 0xa6 -> Node 0 May 14 23:38:06.751118 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 May 14 23:38:06.751123 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 May 14 23:38:06.751130 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 May 14 23:38:06.751135 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 May 14 23:38:06.751140 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 May 14 23:38:06.751146 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 May 14 23:38:06.751153 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 May 14 23:38:06.751160 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 May 14 23:38:06.751165 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 May 14 23:38:06.751171 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0 May 14 23:38:06.751176 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0 May 14 23:38:06.751181 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 May 14 23:38:06.751187 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 May 14 23:38:06.751193 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 May 14 23:38:06.751199 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 May 14 23:38:06.751204 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 May 14 23:38:06.751209 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 May 14 23:38:06.751214 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 May 14 23:38:06.751220 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 May 14 23:38:06.751225 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 May 14 23:38:06.751230 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 May 14 23:38:06.751236 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 May 14 23:38:06.751241 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 May 14 23:38:06.751248 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 May 14 23:38:06.751253 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 May 14 23:38:06.751258 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 May 14 23:38:06.751265 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 May 14 23:38:06.751274 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 May 14 23:38:06.751280 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 May 14 23:38:06.751285 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 May 14 23:38:06.751290 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 May 14 23:38:06.751295 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 May 14 23:38:06.751301 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 May 14 23:38:06.751307 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 May 14 23:38:06.751313 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 May 14 23:38:06.751318 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 May 14 23:38:06.751323 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 May 14 23:38:06.751329 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 May 14 23:38:06.751334 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 May 14 23:38:06.751339 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 May 14 23:38:06.751345 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 May 14 23:38:06.751350 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 May 14 23:38:06.751355 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 May 14 23:38:06.751362 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 May 14 23:38:06.751367 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] May 14 23:38:06.751372 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] May 14 23:38:06.751378 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug May 14 23:38:06.751384 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] May 14 23:38:06.751389 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] May 14 23:38:06.751395 kernel: Zone ranges: May 14 23:38:06.751401 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 14 23:38:06.751409 kernel: 
DMA32 [mem 0x0000000001000000-0x000000007fffffff] May 14 23:38:06.751416 kernel: Normal empty May 14 23:38:06.751423 kernel: Movable zone start for each node May 14 23:38:06.751428 kernel: Early memory node ranges May 14 23:38:06.751433 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] May 14 23:38:06.751439 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] May 14 23:38:06.751444 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] May 14 23:38:06.751450 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] May 14 23:38:06.751455 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 14 23:38:06.751461 kernel: On node 0, zone DMA: 98 pages in unavailable ranges May 14 23:38:06.751466 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges May 14 23:38:06.751473 kernel: ACPI: PM-Timer IO Port: 0x1008 May 14 23:38:06.751479 kernel: system APIC only can use physical flat May 14 23:38:06.751484 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) May 14 23:38:06.751489 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) May 14 23:38:06.751495 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) May 14 23:38:06.751500 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) May 14 23:38:06.754523 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) May 14 23:38:06.754530 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) May 14 23:38:06.754536 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) May 14 23:38:06.754542 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) May 14 23:38:06.754550 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) May 14 23:38:06.754556 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) May 14 23:38:06.754561 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) May 14 23:38:06.754566 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) May 14 23:38:06.754572 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) May 14 23:38:06.754577 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) May 14 23:38:06.754583 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) May 14 23:38:06.754588 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) May 14 23:38:06.754594 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) May 14 23:38:06.754601 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) May 14 23:38:06.754606 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) May 14 23:38:06.754612 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) May 14 23:38:06.754617 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) May 14 23:38:06.754623 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) May 14 23:38:06.754628 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) May 14 23:38:06.754634 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) May 14 23:38:06.754639 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) May 14 23:38:06.754647 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) May 14 23:38:06.754655 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) May 14 23:38:06.754663 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) May 14 23:38:06.754668 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) May 14 23:38:06.754673 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) May 14 23:38:06.754679 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) May 14 23:38:06.754684 kernel: ACPI: 
LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) May 14 23:38:06.754690 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) May 14 23:38:06.754695 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) May 14 23:38:06.754700 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) May 14 23:38:06.754706 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) May 14 23:38:06.754713 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) May 14 23:38:06.754723 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) May 14 23:38:06.754729 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) May 14 23:38:06.754734 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) May 14 23:38:06.754740 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) May 14 23:38:06.754745 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) May 14 23:38:06.754751 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) May 14 23:38:06.754756 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) May 14 23:38:06.754761 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) May 14 23:38:06.754767 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) May 14 23:38:06.754772 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) May 14 23:38:06.754779 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) May 14 23:38:06.754785 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) May 14 23:38:06.754790 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) May 14 23:38:06.754795 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) May 14 23:38:06.754801 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) May 14 23:38:06.754806 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) May 14 23:38:06.754812 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) May 14 23:38:06.754817 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) May 14 23:38:06.754823 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) May 14 23:38:06.754829 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) May 14 23:38:06.754835 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) May 14 23:38:06.754840 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) May 14 23:38:06.754846 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) May 14 23:38:06.754851 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) May 14 23:38:06.754857 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) May 14 23:38:06.754862 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) May 14 23:38:06.754867 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) May 14 23:38:06.754873 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) May 14 23:38:06.754878 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) May 14 23:38:06.754886 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) May 14 23:38:06.754895 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) May 14 23:38:06.754901 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) May 14 23:38:06.754906 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) May 14 23:38:06.754911 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) May 14 23:38:06.754917 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) May 14 23:38:06.754922 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) May 14 23:38:06.754928 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) May 14 23:38:06.754933 
kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) May 14 23:38:06.754939 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) May 14 23:38:06.754946 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) May 14 23:38:06.754951 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) May 14 23:38:06.754958 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) May 14 23:38:06.754966 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) May 14 23:38:06.754974 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) May 14 23:38:06.754979 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) May 14 23:38:06.754984 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) May 14 23:38:06.754990 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) May 14 23:38:06.754995 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) May 14 23:38:06.755001 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) May 14 23:38:06.755008 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) May 14 23:38:06.755013 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) May 14 23:38:06.755018 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) May 14 23:38:06.755024 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) May 14 23:38:06.755029 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) May 14 23:38:06.755035 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) May 14 23:38:06.755040 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) May 14 23:38:06.755045 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) May 14 23:38:06.755051 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) May 14 23:38:06.755057 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) May 14 23:38:06.755063 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) May 14 23:38:06.755068 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) May 14 23:38:06.755074 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) May 14 23:38:06.755079 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) May 14 23:38:06.755085 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) May 14 23:38:06.755090 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) May 14 23:38:06.755095 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) May 14 23:38:06.755101 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) May 14 23:38:06.755106 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) May 14 23:38:06.755113 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) May 14 23:38:06.755120 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) May 14 23:38:06.755128 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) May 14 23:38:06.755135 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) May 14 23:38:06.755140 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) May 14 23:38:06.755146 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) May 14 23:38:06.755151 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) May 14 23:38:06.755156 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) May 14 23:38:06.755162 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) May 14 23:38:06.755167 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) May 14 23:38:06.755174 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) May 14 23:38:06.755180 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) May 14 
23:38:06.755185 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) May 14 23:38:06.755190 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) May 14 23:38:06.755196 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) May 14 23:38:06.755203 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) May 14 23:38:06.755211 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) May 14 23:38:06.755217 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) May 14 23:38:06.755223 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) May 14 23:38:06.755228 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) May 14 23:38:06.755235 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) May 14 23:38:06.755241 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) May 14 23:38:06.755246 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) May 14 23:38:06.755252 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 May 14 23:38:06.755257 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) May 14 23:38:06.755263 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 14 23:38:06.755268 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 May 14 23:38:06.755274 kernel: TSC deadline timer available May 14 23:38:06.755280 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs May 14 23:38:06.755286 kernel: [mem 0x80000000-0xefffffff] available for PCI devices May 14 23:38:06.755292 kernel: Booting paravirtualized kernel on VMware hypervisor May 14 23:38:06.755298 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 14 23:38:06.755303 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 May 14 23:38:06.755309 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 May 14 23:38:06.755315 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 May 14 23:38:06.755320 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 May 14 23:38:06.755326 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 May 14 23:38:06.755331 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 May 14 23:38:06.755337 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 May 14 23:38:06.755343 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 May 14 23:38:06.755355 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 May 14 23:38:06.755362 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 May 14 23:38:06.755368 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 May 14 23:38:06.755377 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 May 14 23:38:06.755384 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 May 14 23:38:06.755390 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 May 14 23:38:06.755397 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 May 14 23:38:06.755403 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 May 14 23:38:06.755409 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 May 14 23:38:06.755414 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 May 14 23:38:06.755420 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 May 14 23:38:06.755427 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 
flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 14 23:38:06.755433 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 14 23:38:06.755439 kernel: random: crng init done May 14 23:38:06.755446 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes May 14 23:38:06.755453 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes May 14 23:38:06.755462 kernel: printk: log_buf_len min size: 262144 bytes May 14 23:38:06.755468 kernel: printk: log_buf_len: 1048576 bytes May 14 23:38:06.755474 kernel: printk: early log buf free: 239648(91%) May 14 23:38:06.755479 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 14 23:38:06.755485 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 14 23:38:06.755491 kernel: Fallback order for Node 0: 0 May 14 23:38:06.755497 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 May 14 23:38:06.755510 kernel: Policy zone: DMA32 May 14 23:38:06.755518 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 14 23:38:06.755525 kernel: Memory: 1932268K/2096628K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 164100K reserved, 0K cma-reserved) May 14 23:38:06.755532 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 May 14 23:38:06.755538 kernel: ftrace: allocating 37993 entries in 149 pages May 14 23:38:06.755544 kernel: ftrace: allocated 149 pages with 4 groups May 14 23:38:06.755551 kernel: Dynamic Preempt: voluntary May 14 23:38:06.755557 kernel: rcu: Preemptible hierarchical RCU implementation. May 14 23:38:06.755563 kernel: rcu: RCU event tracing is enabled. May 14 23:38:06.755569 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. May 14 23:38:06.755575 kernel: Trampoline variant of Tasks RCU enabled. May 14 23:38:06.755581 kernel: Rude variant of Tasks RCU enabled. May 14 23:38:06.755588 kernel: Tracing variant of Tasks RCU enabled. May 14 23:38:06.755594 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 14 23:38:06.755599 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 May 14 23:38:06.755606 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 May 14 23:38:06.755617 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. May 14 23:38:06.755623 kernel: Console: colour VGA+ 80x25 May 14 23:38:06.755629 kernel: printk: console [tty0] enabled May 14 23:38:06.755635 kernel: printk: console [ttyS0] enabled May 14 23:38:06.755641 kernel: ACPI: Core revision 20230628 May 14 23:38:06.755647 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns May 14 23:38:06.755653 kernel: APIC: Switch to symmetric I/O mode setup May 14 23:38:06.755658 kernel: x2apic enabled May 14 23:38:06.755664 kernel: APIC: Switched APIC routing to: physical x2apic May 14 23:38:06.755671 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 14 23:38:06.755677 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns May 14 23:38:06.755683 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) May 14 23:38:06.755691 kernel: Disabled fast string operations May 14 23:38:06.755700 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 May 14 23:38:06.755706 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 May 14 23:38:06.755712 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 14 23:38:06.755718 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit May 14 23:38:06.755724 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall May 14 23:38:06.755731 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS May 14 23:38:06.755737 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT May 14 23:38:06.755743 kernel: RETBleed: Mitigation: Enhanced IBRS May 14 23:38:06.755748 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier May 14 23:38:06.755754 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl May 14 23:38:06.755760 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode May 14 23:38:06.755766 kernel: SRBDS: Unknown: Dependent on hypervisor status May 14 23:38:06.755772 kernel: GDS: Unknown: Dependent on hypervisor status May 14 23:38:06.755778 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 14 23:38:06.755785 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 14 23:38:06.755791 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 14 23:38:06.755797 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 14 23:38:06.755803 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. May 14 23:38:06.755809 kernel: Freeing SMP alternatives memory: 32K May 14 23:38:06.755815 kernel: pid_max: default: 131072 minimum: 1024 May 14 23:38:06.755821 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 14 23:38:06.755827 kernel: landlock: Up and running. May 14 23:38:06.755832 kernel: SELinux: Initializing. May 14 23:38:06.755839 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 14 23:38:06.755845 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 14 23:38:06.755851 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) May 14 23:38:06.755859 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 14 23:38:06.755868 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 14 23:38:06.755874 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. May 14 23:38:06.755880 kernel: Performance Events: Skylake events, core PMU driver. May 14 23:38:06.755887 kernel: core: CPUID marked event: 'cpu cycles' unavailable May 14 23:38:06.755894 kernel: core: CPUID marked event: 'instructions' unavailable May 14 23:38:06.755900 kernel: core: CPUID marked event: 'bus cycles' unavailable May 14 23:38:06.755905 kernel: core: CPUID marked event: 'cache references' unavailable May 14 23:38:06.755911 kernel: core: CPUID marked event: 'cache misses' unavailable May 14 23:38:06.755917 kernel: core: CPUID marked event: 'branch instructions' unavailable May 14 23:38:06.755923 kernel: core: CPUID marked event: 'branch misses' unavailable May 14 23:38:06.755928 kernel: ... 
version: 1 May 14 23:38:06.755934 kernel: ... bit width: 48 May 14 23:38:06.755940 kernel: ... generic registers: 4 May 14 23:38:06.755950 kernel: ... value mask: 0000ffffffffffff May 14 23:38:06.755958 kernel: ... max period: 000000007fffffff May 14 23:38:06.755964 kernel: ... fixed-purpose events: 0 May 14 23:38:06.755969 kernel: ... event mask: 000000000000000f May 14 23:38:06.755975 kernel: signal: max sigframe size: 1776 May 14 23:38:06.755981 kernel: rcu: Hierarchical SRCU implementation. May 14 23:38:06.755987 kernel: rcu: Max phase no-delay instances is 400. May 14 23:38:06.755993 kernel: NMI watchdog: Perf NMI watchdog permanently disabled May 14 23:38:06.755999 kernel: smp: Bringing up secondary CPUs ... May 14 23:38:06.756006 kernel: smpboot: x86: Booting SMP configuration: May 14 23:38:06.756013 kernel: .... node #0, CPUs: #1 May 14 23:38:06.756019 kernel: Disabled fast string operations May 14 23:38:06.756025 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 May 14 23:38:06.756030 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 May 14 23:38:06.756036 kernel: smp: Brought up 1 node, 2 CPUs May 14 23:38:06.756042 kernel: smpboot: Max logical packages: 128 May 14 23:38:06.756048 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) May 14 23:38:06.756054 kernel: devtmpfs: initialized May 14 23:38:06.756060 kernel: x86/mm: Memory block size: 128MB May 14 23:38:06.756067 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) May 14 23:38:06.756073 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 14 23:38:06.756079 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) May 14 23:38:06.756085 kernel: pinctrl core: initialized pinctrl subsystem May 14 23:38:06.756091 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 14 23:38:06.756097 kernel: audit: initializing netlink subsys (disabled) May 14 23:38:06.756103 kernel: audit: type=2000 audit(1747265885.065:1): state=initialized audit_enabled=0 res=1 May 14 23:38:06.756109 kernel: thermal_sys: Registered thermal governor 'step_wise' May 14 23:38:06.756117 kernel: thermal_sys: Registered thermal governor 'user_space' May 14 23:38:06.756127 kernel: cpuidle: using governor menu May 14 23:38:06.756133 kernel: Simple Boot Flag at 0x36 set to 0x80 May 14 23:38:06.756139 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 14 23:38:06.756145 kernel: dca service started, version 1.12.1 May 14 23:38:06.756151 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) May 14 23:38:06.756156 kernel: PCI: Using configuration type 1 for base access May 14 23:38:06.756162 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 14 23:38:06.756168 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 14 23:38:06.756174 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 14 23:38:06.756181 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 14 23:38:06.756189 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 14 23:38:06.756197 kernel: ACPI: Added _OSI(Module Device) May 14 23:38:06.756203 kernel: ACPI: Added _OSI(Processor Device) May 14 23:38:06.756209 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 14 23:38:06.756215 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 14 23:38:06.756221 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 14 23:38:06.756227 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored May 14 23:38:06.756233 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 14 23:38:06.756240 kernel: ACPI: Interpreter enabled May 14 23:38:06.756246 kernel: ACPI: PM: (supports S0 S1 S5) May 14 23:38:06.756252 kernel: ACPI: Using IOAPIC for interrupt routing May 14 23:38:06.756258 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 14 23:38:06.756263 kernel: PCI: Using E820 reservations for host bridge windows May 14 23:38:06.756270 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F May 14 23:38:06.756276 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) May 14 23:38:06.756363 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 14 23:38:06.756425 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] May 14 23:38:06.756486 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] May 14 23:38:06.756496 kernel: PCI host bridge to bus 0000:00 May 14 23:38:06.759345 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 14 23:38:06.759398 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] May 14 23:38:06.759445 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 14 23:38:06.759498 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 14 23:38:06.759560 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] May 14 23:38:06.759617 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] May 14 23:38:06.759678 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 May 14 23:38:06.759744 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 May 14 23:38:06.759809 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 May 14 23:38:06.759876 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a May 14 23:38:06.759934 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] May 14 23:38:06.759989 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 14 23:38:06.760047 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 14 23:38:06.760111 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 14 23:38:06.760165 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 14 23:38:06.760221 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 May 14 23:38:06.760287 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI May 14 23:38:06.760341 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB May 14 23:38:06.760407 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 May 14 23:38:06.760461 
kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] May 14 23:38:06.760536 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] May 14 23:38:06.760598 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 May 14 23:38:06.760659 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] May 14 23:38:06.760715 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] May 14 23:38:06.760767 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] May 14 23:38:06.760826 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] May 14 23:38:06.760884 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 14 23:38:06.760943 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 May 14 23:38:06.761001 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.761065 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold May 14 23:38:06.761128 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.761188 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold May 14 23:38:06.761246 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.761303 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold May 14 23:38:06.761362 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.761425 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold May 14 23:38:06.761485 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.762898 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold May 14 23:38:06.762963 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763023 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold May 14 23:38:06.763084 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763149 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold May 14 23:38:06.763212 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763267 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold May 14 23:38:06.763332 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763394 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold May 14 23:38:06.763454 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763525 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold May 14 23:38:06.763594 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763654 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold May 14 23:38:06.763714 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763767 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold May 14 23:38:06.763832 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.763886 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold May 14 23:38:06.763973 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.764439 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold May 14 23:38:06.764556 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.764617 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold May 14 23:38:06.764682 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.764739 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold May 14 23:38:06.764811 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 
0x060400 May 14 23:38:06.764865 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold May 14 23:38:06.764921 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.764982 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold May 14 23:38:06.765046 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765104 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold May 14 23:38:06.765164 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765226 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold May 14 23:38:06.765288 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765381 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold May 14 23:38:06.765460 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765582 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold May 14 23:38:06.765646 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765703 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold May 14 23:38:06.765763 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765821 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold May 14 23:38:06.765879 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.765932 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold May 14 23:38:06.765998 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766054 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold May 14 23:38:06.766120 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766172 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold May 14 23:38:06.766228 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766287 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold May 14 23:38:06.766343 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766398 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold May 14 23:38:06.766454 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766517 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold May 14 23:38:06.766582 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766638 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold May 14 23:38:06.766698 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 May 14 23:38:06.766759 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold May 14 23:38:06.766817 kernel: pci_bus 0000:01: extended config space not accessible May 14 23:38:06.766874 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 14 23:38:06.766927 kernel: pci_bus 0000:02: extended config space not accessible May 14 23:38:06.766936 kernel: acpiphp: Slot [32] registered May 14 23:38:06.766943 kernel: acpiphp: Slot [33] registered May 14 23:38:06.766949 kernel: acpiphp: Slot [34] registered May 14 23:38:06.766957 kernel: acpiphp: Slot [35] registered May 14 23:38:06.766962 kernel: acpiphp: Slot [36] registered May 14 23:38:06.766971 kernel: acpiphp: Slot [37] registered May 14 23:38:06.766980 kernel: acpiphp: Slot [38] registered May 14 23:38:06.766986 kernel: acpiphp: Slot [39] registered May 14 23:38:06.766992 kernel: acpiphp: Slot [40] registered May 14 23:38:06.766998 kernel: acpiphp: Slot [41] registered May 14 23:38:06.767004 kernel: acpiphp: Slot [42] registered May 14 
23:38:06.767010 kernel: acpiphp: Slot [43] registered May 14 23:38:06.767016 kernel: acpiphp: Slot [44] registered May 14 23:38:06.767031 kernel: acpiphp: Slot [45] registered May 14 23:38:06.767037 kernel: acpiphp: Slot [46] registered May 14 23:38:06.767043 kernel: acpiphp: Slot [47] registered May 14 23:38:06.767049 kernel: acpiphp: Slot [48] registered May 14 23:38:06.767054 kernel: acpiphp: Slot [49] registered May 14 23:38:06.767060 kernel: acpiphp: Slot [50] registered May 14 23:38:06.767066 kernel: acpiphp: Slot [51] registered May 14 23:38:06.767072 kernel: acpiphp: Slot [52] registered May 14 23:38:06.767078 kernel: acpiphp: Slot [53] registered May 14 23:38:06.767087 kernel: acpiphp: Slot [54] registered May 14 23:38:06.767096 kernel: acpiphp: Slot [55] registered May 14 23:38:06.767102 kernel: acpiphp: Slot [56] registered May 14 23:38:06.767108 kernel: acpiphp: Slot [57] registered May 14 23:38:06.767114 kernel: acpiphp: Slot [58] registered May 14 23:38:06.767120 kernel: acpiphp: Slot [59] registered May 14 23:38:06.767126 kernel: acpiphp: Slot [60] registered May 14 23:38:06.767131 kernel: acpiphp: Slot [61] registered May 14 23:38:06.767137 kernel: acpiphp: Slot [62] registered May 14 23:38:06.767143 kernel: acpiphp: Slot [63] registered May 14 23:38:06.767201 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) May 14 23:38:06.767273 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] May 14 23:38:06.767327 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] May 14 23:38:06.767389 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] May 14 23:38:06.767441 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) May 14 23:38:06.767493 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) May 14 23:38:06.767573 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) May 14 23:38:06.767635 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) May 14 23:38:06.767688 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) May 14 23:38:06.767759 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 May 14 23:38:06.767824 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] May 14 23:38:06.767882 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] May 14 23:38:06.767942 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] May 14 23:38:06.767995 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold May 14 23:38:06.768061 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' May 14 23:38:06.768117 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 14 23:38:06.768181 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] May 14 23:38:06.768234 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] May 14 23:38:06.768292 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 14 23:38:06.768348 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] May 14 23:38:06.768405 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] May 14 23:38:06.768462 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] May 14 23:38:06.768586 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 14 23:38:06.768642 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] May 14 23:38:06.768701 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] May 14 23:38:06.768754 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] May 14 23:38:06.768807 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 14 23:38:06.768865 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] May 14 23:38:06.768920 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] May 14 23:38:06.768980 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 14 23:38:06.769036 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] May 14 23:38:06.769087 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] May 14 23:38:06.769150 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 14 23:38:06.769213 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] May 14 23:38:06.769269 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] May 14 23:38:06.769323 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 14 23:38:06.769386 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] May 14 23:38:06.769448 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] May 14 23:38:06.769528 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 14 23:38:06.769596 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] May 14 23:38:06.769656 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] May 14 23:38:06.769723 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 May 14 23:38:06.769782 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] May 14 23:38:06.769836 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] May 14 23:38:06.769899 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] May 14 23:38:06.769963 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] May 14 23:38:06.770017 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] May 14 23:38:06.770072 kernel: pci 0000:0b:00.0: supports D1 D2 May 14 23:38:06.770136 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold May 14 23:38:06.770203 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' May 14 23:38:06.770267 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 14 23:38:06.770321 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] May 14 23:38:06.770384 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] May 14 23:38:06.770448 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 14 23:38:06.772516 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] May 14 23:38:06.772589 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] May 14 23:38:06.772652 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] May 14 23:38:06.772714 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 14 23:38:06.772775 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] May 14 23:38:06.772829 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] May 14 23:38:06.772881 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] May 14 23:38:06.772943 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 14 23:38:06.773003 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] May 14 23:38:06.773057 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] May 14 23:38:06.773109 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 14 23:38:06.773165 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] May 14 23:38:06.773232 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] May 14 23:38:06.773296 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 14 23:38:06.773350 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] May 14 23:38:06.773403 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] May 14 23:38:06.773456 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 14 23:38:06.773530 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] May 14 23:38:06.773594 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] May 14 23:38:06.773651 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 14 23:38:06.773705 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] May 14 23:38:06.773765 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] May 14 23:38:06.773820 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 14 23:38:06.773880 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] May 14 23:38:06.773932 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] May 14 23:38:06.773985 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] May 14 23:38:06.774048 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 14 23:38:06.774108 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] May 14 23:38:06.774166 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] May 14 23:38:06.774224 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] May 14 23:38:06.774279 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 14 23:38:06.774339 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] May 14 23:38:06.774402 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] May 14 23:38:06.774457 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] May 14 23:38:06.774547 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 14 23:38:06.774603 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] May 14 23:38:06.774667 kernel: pci 
0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] May 14 23:38:06.774721 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 14 23:38:06.774781 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] May 14 23:38:06.774833 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] May 14 23:38:06.774887 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 14 23:38:06.774949 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] May 14 23:38:06.775015 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] May 14 23:38:06.775068 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 14 23:38:06.775121 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] May 14 23:38:06.775180 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] May 14 23:38:06.775235 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 14 23:38:06.775296 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] May 14 23:38:06.775349 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] May 14 23:38:06.775402 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 14 23:38:06.775467 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] May 14 23:38:06.775903 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] May 14 23:38:06.775972 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] May 14 23:38:06.776034 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 14 23:38:06.776090 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] May 14 23:38:06.776142 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] May 14 23:38:06.776202 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] May 14 23:38:06.776264 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 14 23:38:06.776324 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] May 14 23:38:06.776376 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] May 14 23:38:06.776434 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 14 23:38:06.776491 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] May 14 23:38:06.776696 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] May 14 23:38:06.776761 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 14 23:38:06.776822 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] May 14 23:38:06.776878 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] May 14 23:38:06.776941 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 14 23:38:06.776995 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] May 14 23:38:06.777055 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] May 14 23:38:06.777110 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 14 23:38:06.777163 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] May 14 23:38:06.777234 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] May 14 23:38:06.777292 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 14 23:38:06.777351 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] May 14 23:38:06.777407 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] May 14 23:38:06.777416 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 May 14 23:38:06.777423 kernel: ACPI: PCI: Interrupt link 
LNKB configured for IRQ 0 May 14 23:38:06.777429 kernel: ACPI: PCI: Interrupt link LNKB disabled May 14 23:38:06.777435 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 14 23:38:06.777441 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 May 14 23:38:06.777447 kernel: iommu: Default domain type: Translated May 14 23:38:06.777453 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 14 23:38:06.777461 kernel: PCI: Using ACPI for IRQ routing May 14 23:38:06.777467 kernel: PCI: pci_cache_line_size set to 64 bytes May 14 23:38:06.777473 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] May 14 23:38:06.777479 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] May 14 23:38:06.777540 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device May 14 23:38:06.777593 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible May 14 23:38:06.777646 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 14 23:38:06.777655 kernel: vgaarb: loaded May 14 23:38:06.777661 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 May 14 23:38:06.777670 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter May 14 23:38:06.777676 kernel: clocksource: Switched to clocksource tsc-early May 14 23:38:06.777682 kernel: VFS: Disk quotas dquot_6.6.0 May 14 23:38:06.777688 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 14 23:38:06.777696 kernel: pnp: PnP ACPI init May 14 23:38:06.777757 kernel: system 00:00: [io 0x1000-0x103f] has been reserved May 14 23:38:06.777806 kernel: system 00:00: [io 0x1040-0x104f] has been reserved May 14 23:38:06.777864 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved May 14 23:38:06.777919 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved May 14 23:38:06.777971 kernel: pnp 00:06: [dma 2] May 14 23:38:06.778026 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved May 14 23:38:06.778085 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved May 14 23:38:06.778133 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved May 14 23:38:06.778142 kernel: pnp: PnP ACPI: found 8 devices May 14 23:38:06.778148 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 14 23:38:06.778157 kernel: NET: Registered PF_INET protocol family May 14 23:38:06.778163 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) May 14 23:38:06.778169 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) May 14 23:38:06.778176 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 14 23:38:06.778182 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) May 14 23:38:06.778188 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) May 14 23:38:06.778194 kernel: TCP: Hash tables configured (established 16384 bind 16384) May 14 23:38:06.778200 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) May 14 23:38:06.778206 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) May 14 23:38:06.778213 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 14 23:38:06.778221 kernel: NET: Registered PF_XDP protocol family May 14 23:38:06.778281 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 May 14 23:38:06.778336 kernel: pci 
0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 May 14 23:38:06.778390 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 14 23:38:06.778445 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 14 23:38:06.778521 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 14 23:38:06.778588 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 May 14 23:38:06.778642 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 May 14 23:38:06.778695 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 May 14 23:38:06.778757 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 May 14 23:38:06.778811 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 May 14 23:38:06.778868 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 May 14 23:38:06.778921 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 May 14 23:38:06.778984 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 May 14 23:38:06.779038 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 May 14 23:38:06.779090 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 May 14 23:38:06.779149 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 May 14 23:38:06.779207 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 May 14 23:38:06.779260 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 May 14 23:38:06.779312 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 May 14 23:38:06.779375 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 May 14 23:38:06.779433 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 May 14 23:38:06.779492 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 May 14 23:38:06.779558 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 May 14 23:38:06.779614 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] May 14 23:38:06.779674 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] May 14 23:38:06.779735 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.779788 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.779842 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.779899 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.779957 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780020 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780073 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780126 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780187 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780246 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780305 kernel: pci 0000:00:16.3: BAR 13: no space for [io 
size 0x1000] May 14 23:38:06.780358 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780415 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780467 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780544 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780600 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780661 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780714 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780769 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780829 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780892 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.780946 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.780998 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781051 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781109 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781167 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781225 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781278 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781339 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781393 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781456 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781591 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781648 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781701 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781753 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781807 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781870 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.781923 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.781986 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782040 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782091 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782152 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782206 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782257 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782318 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782375 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782427 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782479 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782539 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782599 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782652 kernel: pci 0000:00:18.3: BAR 13: no space 
for [io size 0x1000] May 14 23:38:06.782711 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782765 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782817 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782878 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.782934 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.782985 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783042 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783100 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783153 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783206 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783263 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783328 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783381 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783441 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783496 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783580 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783644 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783702 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783755 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783819 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783882 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.783938 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.783995 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.784059 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] May 14 23:38:06.784113 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.784171 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] May 14 23:38:06.784234 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.784288 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] May 14 23:38:06.784341 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.784402 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] May 14 23:38:06.784456 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.785681 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] May 14 23:38:06.785751 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] May 14 23:38:06.785808 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] May 14 23:38:06.785871 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] May 14 23:38:06.785925 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] May 14 23:38:06.785978 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] May 14 23:38:06.786038 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] May 14 23:38:06.786096 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] May 14 23:38:06.786150 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] May 
14 23:38:06.786207 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] May 14 23:38:06.786270 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] May 14 23:38:06.786322 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] May 14 23:38:06.786382 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] May 14 23:38:06.786440 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] May 14 23:38:06.786493 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] May 14 23:38:06.787638 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] May 14 23:38:06.787707 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] May 14 23:38:06.787765 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] May 14 23:38:06.787822 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] May 14 23:38:06.787883 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] May 14 23:38:06.787937 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] May 14 23:38:06.787989 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] May 14 23:38:06.788053 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] May 14 23:38:06.788107 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] May 14 23:38:06.788159 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] May 14 23:38:06.788221 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] May 14 23:38:06.788277 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] May 14 23:38:06.788330 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] May 14 23:38:06.788383 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] May 14 23:38:06.788444 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] May 14 23:38:06.788498 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] May 14 23:38:06.788607 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] May 14 23:38:06.788660 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] May 14 23:38:06.788717 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] May 14 23:38:06.788780 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] May 14 23:38:06.788837 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] May 14 23:38:06.788891 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] May 14 23:38:06.788950 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] May 14 23:38:06.789003 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] May 14 23:38:06.789055 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] May 14 23:38:06.789108 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] May 14 23:38:06.789170 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] May 14 23:38:06.789225 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] May 14 23:38:06.789283 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] May 14 23:38:06.789345 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] May 14 23:38:06.789397 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] May 14 23:38:06.789450 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] May 14 23:38:06.789769 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] May 14 23:38:06.789829 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] May 14 23:38:06.789890 kernel: pci 0000:00:16.3: 
bridge window [mem 0xfc800000-0xfc8fffff] May 14 23:38:06.789946 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] May 14 23:38:06.790001 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] May 14 23:38:06.790066 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] May 14 23:38:06.790120 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] May 14 23:38:06.790173 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] May 14 23:38:06.790241 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] May 14 23:38:06.790295 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] May 14 23:38:06.790352 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] May 14 23:38:06.790411 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] May 14 23:38:06.790464 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] May 14 23:38:06.790572 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] May 14 23:38:06.790638 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] May 14 23:38:06.790694 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] May 14 23:38:06.790755 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] May 14 23:38:06.791150 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] May 14 23:38:06.791210 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] May 14 23:38:06.791266 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] May 14 23:38:06.791321 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] May 14 23:38:06.791384 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] May 14 23:38:06.791437 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] May 14 23:38:06.791498 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] May 14 23:38:06.791572 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] May 14 23:38:06.791625 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] May 14 23:38:06.791689 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] May 14 23:38:06.791743 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] May 14 23:38:06.791796 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] May 14 23:38:06.791858 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] May 14 23:38:06.791912 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] May 14 23:38:06.791966 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] May 14 23:38:06.792020 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] May 14 23:38:06.792084 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] May 14 23:38:06.792139 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] May 14 23:38:06.792192 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] May 14 23:38:06.792252 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] May 14 23:38:06.792309 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] May 14 23:38:06.792362 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] May 14 23:38:06.792446 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] May 14 23:38:06.792747 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] May 14 23:38:06.792813 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] May 14 23:38:06.792869 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] May 14 23:38:06.792935 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] May 14 23:38:06.792989 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] May 14 23:38:06.793052 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] May 14 23:38:06.793106 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] May 14 23:38:06.793160 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] May 14 23:38:06.793438 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] May 14 23:38:06.793508 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] May 14 23:38:06.794446 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] May 14 23:38:06.794523 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] May 14 23:38:06.794586 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] May 14 23:38:06.794640 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] May 14 23:38:06.794695 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] May 14 23:38:06.794747 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] May 14 23:38:06.794800 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] May 14 23:38:06.794855 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] May 14 23:38:06.794923 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] May 14 23:38:06.794976 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] May 14 23:38:06.795029 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] May 14 23:38:06.795082 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] May 14 23:38:06.795139 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] May 14 23:38:06.795192 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] May 14 23:38:06.795245 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] May 14 23:38:06.795298 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] May 14 23:38:06.795351 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] May 14 23:38:06.795403 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] May 14 23:38:06.795455 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] May 14 23:38:06.795516 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] May 14 23:38:06.795565 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] May 14 23:38:06.795624 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] May 14 23:38:06.795674 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] May 14 23:38:06.795720 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] May 14 23:38:06.795771 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] May 14 23:38:06.795820 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] May 14 23:38:06.795867 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] May 14 23:38:06.795920 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] May 14 23:38:06.795971 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] May 14 23:38:06.796019 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] May 14 23:38:06.796067 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] May 14 23:38:06.796114 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] May 14 23:38:06.796168 kernel: pci_bus 0000:03: resource 0 [io 
0x4000-0x4fff] May 14 23:38:06.796217 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] May 14 23:38:06.796264 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] May 14 23:38:06.796317 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] May 14 23:38:06.796368 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] May 14 23:38:06.796416 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] May 14 23:38:06.796468 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] May 14 23:38:06.797548 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] May 14 23:38:06.797616 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] May 14 23:38:06.797672 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] May 14 23:38:06.797727 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] May 14 23:38:06.797793 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] May 14 23:38:06.797843 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] May 14 23:38:06.797896 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] May 14 23:38:06.797945 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] May 14 23:38:06.798005 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] May 14 23:38:06.798058 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] May 14 23:38:06.798109 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] May 14 23:38:06.798167 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] May 14 23:38:06.798230 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] May 14 23:38:06.798279 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] May 14 23:38:06.798336 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] May 14 23:38:06.798393 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] May 14 23:38:06.798442 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] May 14 23:38:06.798547 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] May 14 23:38:06.798605 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] May 14 23:38:06.798654 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] May 14 23:38:06.798701 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] May 14 23:38:06.798764 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] May 14 23:38:06.798817 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] May 14 23:38:06.798871 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] May 14 23:38:06.798926 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] May 14 23:38:06.798979 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] May 14 23:38:06.799027 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] May 14 23:38:06.799088 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] May 14 23:38:06.799141 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] May 14 23:38:06.799193 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] May 14 23:38:06.799253 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] May 14 23:38:06.799307 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] May 14 23:38:06.799355 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] May 14 23:38:06.799403 
kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] May 14 23:38:06.799454 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] May 14 23:38:06.799529 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] May 14 23:38:06.799580 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] May 14 23:38:06.799637 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] May 14 23:38:06.799690 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] May 14 23:38:06.799739 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] May 14 23:38:06.799793 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] May 14 23:38:06.799856 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] May 14 23:38:06.799909 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] May 14 23:38:06.799957 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] May 14 23:38:06.800019 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] May 14 23:38:06.800069 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] May 14 23:38:06.800121 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] May 14 23:38:06.800172 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] May 14 23:38:06.800237 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] May 14 23:38:06.800291 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] May 14 23:38:06.800343 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] May 14 23:38:06.800401 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] May 14 23:38:06.800453 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] May 14 23:38:06.800525 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] May 14 23:38:06.800587 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] May 14 23:38:06.800636 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] May 14 23:38:06.800687 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] May 14 23:38:06.800740 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] May 14 23:38:06.800797 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] May 14 23:38:06.800846 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] May 14 23:38:06.800903 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] May 14 23:38:06.800953 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] May 14 23:38:06.801013 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] May 14 23:38:06.801063 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] May 14 23:38:06.801115 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] May 14 23:38:06.801175 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] May 14 23:38:06.801231 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] May 14 23:38:06.801280 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] May 14 23:38:06.801346 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 14 23:38:06.801356 kernel: PCI: CLS 32 bytes, default 64 May 14 23:38:06.801363 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer May 14 23:38:06.801370 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns May 14 
23:38:06.801376 kernel: clocksource: Switched to clocksource tsc May 14 23:38:06.801385 kernel: Initialise system trusted keyrings May 14 23:38:06.801391 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 14 23:38:06.801397 kernel: Key type asymmetric registered May 14 23:38:06.801404 kernel: Asymmetric key parser 'x509' registered May 14 23:38:06.801410 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 14 23:38:06.801417 kernel: io scheduler mq-deadline registered May 14 23:38:06.801423 kernel: io scheduler kyber registered May 14 23:38:06.801429 kernel: io scheduler bfq registered May 14 23:38:06.801486 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 May 14 23:38:06.801643 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.801699 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 May 14 23:38:06.801763 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.801818 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 May 14 23:38:06.801871 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.801931 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 May 14 23:38:06.801986 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802043 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 May 14 23:38:06.802104 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802158 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 May 14 23:38:06.802211 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802275 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 May 14 23:38:06.802332 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802386 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 May 14 23:38:06.802439 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802499 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 May 14 23:38:06.802572 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802630 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 May 14 23:38:06.802687 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802746 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 May 14 23:38:06.802802 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802863 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 May 14 23:38:06.802918 kernel: pcieport 
0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.802977 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 May 14 23:38:06.803034 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803091 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 May 14 23:38:06.803145 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803198 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 May 14 23:38:06.803266 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803321 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 May 14 23:38:06.803389 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803443 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 May 14 23:38:06.803498 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803579 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 May 14 23:38:06.803634 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803688 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 May 14 23:38:06.803750 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803808 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 May 14 23:38:06.803861 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.803914 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 May 14 23:38:06.803974 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804030 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 May 14 23:38:06.804085 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804147 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 May 14 23:38:06.804201 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804262 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 May 14 23:38:06.804326 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804383 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 May 14 23:38:06.804436 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804497 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 May 14 23:38:06.804583 kernel: pcieport 
0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804637 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 May 14 23:38:06.804690 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804751 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 May 14 23:38:06.804808 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804869 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 May 14 23:38:06.804924 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.804977 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 May 14 23:38:06.805036 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.805094 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 May 14 23:38:06.805149 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.805209 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 May 14 23:38:06.805265 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ May 14 23:38:06.805275 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 14 23:38:06.805282 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 14 23:38:06.805290 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 14 23:38:06.805297 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 May 14 23:38:06.805303 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 14 23:38:06.805309 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 14 23:38:06.805363 kernel: rtc_cmos 00:01: registered as rtc0 May 14 23:38:06.805413 kernel: rtc_cmos 00:01: setting system clock to 2025-05-14T23:38:06 UTC (1747265886) May 14 23:38:06.805472 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram May 14 23:38:06.805482 kernel: intel_pstate: CPU model not supported May 14 23:38:06.805491 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 14 23:38:06.805497 kernel: NET: Registered PF_INET6 protocol family May 14 23:38:06.805510 kernel: Segment Routing with IPv6 May 14 23:38:06.805517 kernel: In-situ OAM (IOAM) with IPv6 May 14 23:38:06.805523 kernel: NET: Registered PF_PACKET protocol family May 14 23:38:06.805529 kernel: Key type dns_resolver registered May 14 23:38:06.805535 kernel: IPI shorthand broadcast: enabled May 14 23:38:06.805542 kernel: sched_clock: Marking stable (887384982, 225672456)->(1174290415, -61232977) May 14 23:38:06.805549 kernel: registered taskstats version 1 May 14 23:38:06.805557 kernel: Loading compiled-in X.509 certificates May 14 23:38:06.805564 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 4f9bc5b8797c7efeb1fcd74892dea83a6cb9d390' May 14 23:38:06.805570 kernel: Key type .fscrypt registered May 14 23:38:06.805576 kernel: Key type fscrypt-provisioning registered May 14 23:38:06.805583 
kernel: ima: No TPM chip found, activating TPM-bypass! May 14 23:38:06.805593 kernel: ima: Allocated hash algorithm: sha1 May 14 23:38:06.805601 kernel: ima: No architecture policies found May 14 23:38:06.805607 kernel: clk: Disabling unused clocks May 14 23:38:06.805613 kernel: Freeing unused kernel image (initmem) memory: 43604K May 14 23:38:06.805621 kernel: Write protecting the kernel read-only data: 40960k May 14 23:38:06.805627 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 14 23:38:06.805634 kernel: Run /init as init process May 14 23:38:06.805640 kernel: with arguments: May 14 23:38:06.805646 kernel: /init May 14 23:38:06.805652 kernel: with environment: May 14 23:38:06.805659 kernel: HOME=/ May 14 23:38:06.805664 kernel: TERM=linux May 14 23:38:06.805670 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 14 23:38:06.805679 systemd[1]: Successfully made /usr/ read-only. May 14 23:38:06.805688 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 14 23:38:06.805695 systemd[1]: Detected virtualization vmware. May 14 23:38:06.805702 systemd[1]: Detected architecture x86-64. May 14 23:38:06.805708 systemd[1]: Running in initrd. May 14 23:38:06.805714 systemd[1]: No hostname configured, using default hostname. May 14 23:38:06.805721 systemd[1]: Hostname set to . May 14 23:38:06.805729 systemd[1]: Initializing machine ID from random generator. May 14 23:38:06.805735 systemd[1]: Queued start job for default target initrd.target. May 14 23:38:06.805742 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 23:38:06.805748 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 23:38:06.805756 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 14 23:38:06.805764 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 14 23:38:06.805774 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 14 23:38:06.805783 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 14 23:38:06.805790 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 14 23:38:06.805797 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 14 23:38:06.805803 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 23:38:06.805810 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 14 23:38:06.805817 systemd[1]: Reached target paths.target - Path Units. May 14 23:38:06.805823 systemd[1]: Reached target slices.target - Slice Units. May 14 23:38:06.805830 systemd[1]: Reached target swap.target - Swaps. May 14 23:38:06.805838 systemd[1]: Reached target timers.target - Timer Units. May 14 23:38:06.805844 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 14 23:38:06.805851 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
May 14 23:38:06.805857 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 14 23:38:06.805864 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 14 23:38:06.805871 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 14 23:38:06.805877 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 14 23:38:06.805884 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 14 23:38:06.805890 systemd[1]: Reached target sockets.target - Socket Units. May 14 23:38:06.805898 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 14 23:38:06.805905 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 14 23:38:06.805912 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 14 23:38:06.805920 systemd[1]: Starting systemd-fsck-usr.service... May 14 23:38:06.805931 systemd[1]: Starting systemd-journald.service - Journal Service... May 14 23:38:06.805937 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 14 23:38:06.805944 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 23:38:06.805950 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 14 23:38:06.805957 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 14 23:38:06.805965 systemd[1]: Finished systemd-fsck-usr.service. May 14 23:38:06.805972 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 14 23:38:06.805996 systemd-journald[218]: Collecting audit messages is disabled. May 14 23:38:06.806016 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 23:38:06.806023 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 14 23:38:06.806029 kernel: Bridge firewalling registered May 14 23:38:06.806036 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 14 23:38:06.806043 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 14 23:38:06.806051 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 14 23:38:06.806058 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 23:38:06.806064 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 14 23:38:06.806071 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 14 23:38:06.806078 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 23:38:06.806085 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 23:38:06.806091 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 14 23:38:06.806099 systemd-journald[218]: Journal started May 14 23:38:06.806115 systemd-journald[218]: Runtime Journal (/run/log/journal/4a728f143b204744a94c87f2c1a3052a) is 4.8M, max 38.6M, 33.7M free. May 14 23:38:06.747781 systemd-modules-load[219]: Inserted module 'overlay' May 14 23:38:06.765424 systemd-modules-load[219]: Inserted module 'br_netfilter' May 14 23:38:06.809820 systemd[1]: Started systemd-journald.service - Journal Service. 
May 14 23:38:06.811618 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 23:38:06.815290 dracut-cmdline[239]: dracut-dracut-053 May 14 23:38:06.816970 dracut-cmdline[239]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 14 23:38:06.820800 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 23:38:06.822817 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 23:38:06.847026 systemd-resolved[271]: Positive Trust Anchors: May 14 23:38:06.847036 systemd-resolved[271]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 14 23:38:06.847059 systemd-resolved[271]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 23:38:06.849789 systemd-resolved[271]: Defaulting to hostname 'linux'. May 14 23:38:06.850409 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 23:38:06.850560 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 23:38:06.858554 kernel: SCSI subsystem initialized May 14 23:38:06.864519 kernel: Loading iSCSI transport class v2.0-870. May 14 23:38:06.873514 kernel: iscsi: registered transport (tcp) May 14 23:38:06.884518 kernel: iscsi: registered transport (qla4xxx) May 14 23:38:06.884546 kernel: QLogic iSCSI HBA Driver May 14 23:38:06.903764 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 14 23:38:06.904778 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 14 23:38:06.931296 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 14 23:38:06.931335 kernel: device-mapper: uevent: version 1.0.3 May 14 23:38:06.931345 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 14 23:38:06.961566 kernel: raid6: avx2x4 gen() 46990 MB/s May 14 23:38:06.978512 kernel: raid6: avx2x2 gen() 51818 MB/s May 14 23:38:06.995681 kernel: raid6: avx2x1 gen() 44469 MB/s May 14 23:38:06.995719 kernel: raid6: using algorithm avx2x2 gen() 51818 MB/s May 14 23:38:07.013721 kernel: raid6: .... xor() 32396 MB/s, rmw enabled May 14 23:38:07.013757 kernel: raid6: using avx2x2 recovery algorithm May 14 23:38:07.027519 kernel: xor: automatically using best checksumming function avx May 14 23:38:07.124528 kernel: Btrfs loaded, zoned=no, fsverity=no May 14 23:38:07.129952 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 14 23:38:07.130883 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
May 14 23:38:07.144930 systemd-udevd[435]: Using default interface naming scheme 'v255'. May 14 23:38:07.147835 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 23:38:07.150563 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 14 23:38:07.161105 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation May 14 23:38:07.175678 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 14 23:38:07.176788 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 14 23:38:07.251799 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 14 23:38:07.252940 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 14 23:38:07.270613 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 14 23:38:07.271156 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 14 23:38:07.271311 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 23:38:07.271600 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 14 23:38:07.272571 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 14 23:38:07.286827 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 14 23:38:07.313529 kernel: VMware PVSCSI driver - version 1.0.7.0-k May 14 23:38:07.322671 kernel: vmw_pvscsi: using 64bit dma May 14 23:38:07.325930 kernel: vmw_pvscsi: max_id: 16 May 14 23:38:07.325954 kernel: vmw_pvscsi: setting ring_pages to 8 May 14 23:38:07.330531 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI May 14 23:38:07.335528 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 May 14 23:38:07.339534 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps May 14 23:38:07.343858 kernel: vmw_pvscsi: enabling reqCallThreshold May 14 23:38:07.343877 kernel: vmw_pvscsi: driver-based request coalescing enabled May 14 23:38:07.343886 kernel: vmw_pvscsi: using MSI-X May 14 23:38:07.343894 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 May 14 23:38:07.347515 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 May 14 23:38:07.348539 kernel: cryptd: max_cpu_qlen set to 1000 May 14 23:38:07.353247 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 14 23:38:07.353368 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 23:38:07.354419 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 May 14 23:38:07.354424 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 14 23:38:07.354609 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 23:38:07.354718 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 23:38:07.356593 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 May 14 23:38:07.355827 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 14 23:38:07.357668 kernel: libata version 3.00 loaded. May 14 23:38:07.357909 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 14 23:38:07.362544 kernel: ata_piix 0000:00:07.1: version 2.13 May 14 23:38:07.364862 kernel: scsi host1: ata_piix May 14 23:38:07.367056 kernel: scsi host2: ata_piix May 14 23:38:07.367134 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 May 14 23:38:07.367144 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 May 14 23:38:07.374522 kernel: AVX2 version of gcm_enc/dec engaged. May 14 23:38:07.374552 kernel: AES CTR mode by8 optimization enabled May 14 23:38:07.383203 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 23:38:07.384356 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 14 23:38:07.406582 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 23:38:07.537540 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 May 14 23:38:07.543521 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 May 14 23:38:07.558017 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) May 14 23:38:07.558148 kernel: sd 0:0:0:0: [sda] Write Protect is off May 14 23:38:07.558218 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 May 14 23:38:07.558285 kernel: sd 0:0:0:0: [sda] Cache data unavailable May 14 23:38:07.558349 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through May 14 23:38:07.599523 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 14 23:38:07.600515 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 14 23:38:07.614595 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray May 14 23:38:07.614769 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 14 23:38:07.625635 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 14 23:38:07.839520 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (489) May 14 23:38:07.848681 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. May 14 23:38:07.854411 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. May 14 23:38:07.860038 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. May 14 23:38:07.890523 kernel: BTRFS: device fsid 267fa270-7a71-43aa-9209-0280512688b5 devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (488) May 14 23:38:07.898145 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. May 14 23:38:07.898414 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. May 14 23:38:07.899173 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 14 23:38:08.229520 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 14 23:38:09.253294 disk-uuid[598]: The operation has completed successfully. May 14 23:38:09.253539 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 14 23:38:09.291750 systemd[1]: disk-uuid.service: Deactivated successfully. May 14 23:38:09.291827 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 14 23:38:09.307392 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 14 23:38:09.317476 sh[612]: Success May 14 23:38:09.325522 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" May 14 23:38:09.387343 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
May 14 23:38:09.389567 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 14 23:38:09.398716 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 14 23:38:09.414541 kernel: BTRFS info (device dm-0): first mount of filesystem 267fa270-7a71-43aa-9209-0280512688b5 May 14 23:38:09.414574 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 14 23:38:09.414583 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 14 23:38:09.416989 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 14 23:38:09.417006 kernel: BTRFS info (device dm-0): using free space tree May 14 23:38:09.425522 kernel: BTRFS info (device dm-0): enabling ssd optimizations May 14 23:38:09.428527 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 14 23:38:09.429362 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... May 14 23:38:09.431570 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 14 23:38:09.554050 kernel: BTRFS info (device sda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 14 23:38:09.554105 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 14 23:38:09.554115 kernel: BTRFS info (device sda6): using free space tree May 14 23:38:09.592521 kernel: BTRFS info (device sda6): enabling ssd optimizations May 14 23:38:09.598520 kernel: BTRFS info (device sda6): last unmount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 14 23:38:09.615679 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 14 23:38:09.616582 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 14 23:38:09.760101 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. May 14 23:38:09.760974 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 14 23:38:09.808164 ignition[668]: Ignition 2.20.0 May 14 23:38:09.809064 ignition[668]: Stage: fetch-offline May 14 23:38:09.809200 ignition[668]: no configs at "/usr/lib/ignition/base.d" May 14 23:38:09.809313 ignition[668]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:09.809492 ignition[668]: parsed url from cmdline: "" May 14 23:38:09.809529 ignition[668]: no config URL provided May 14 23:38:09.809625 ignition[668]: reading system config file "/usr/lib/ignition/user.ign" May 14 23:38:09.809727 ignition[668]: no config at "/usr/lib/ignition/user.ign" May 14 23:38:09.810091 ignition[668]: config successfully fetched May 14 23:38:09.810109 ignition[668]: parsing config with SHA512: 98a8cf29ed7e14c3ff7d49f3288645fad2adaa49acb21652bc7b191fddc9cb6a948a495ab1b1ce44b43758ebc259c0bfb07e016b8dde075da40ca5915382f01e May 14 23:38:09.813484 unknown[668]: fetched base config from "system" May 14 23:38:09.813736 unknown[668]: fetched user config from "vmware" May 14 23:38:09.814171 ignition[668]: fetch-offline: fetch-offline passed May 14 23:38:09.814350 ignition[668]: Ignition finished successfully May 14 23:38:09.815373 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 14 23:38:09.835200 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 14 23:38:09.836359 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
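[The fetch-offline stage above logs the digest of the config it parsed ("parsing config with SHA512: 98a8cf…"). Purely as an illustrative sketch, the Python below computes the same kind of SHA512 hex digest over a local file; the path /usr/lib/ignition/user.ign is taken from the log lines above, and reading it directly is an assumption for demonstration, not something Ignition itself exposes.]

import hashlib
import sys

def sha512_of_file(path: str) -> str:
    """Return the SHA512 hex digest of a file, read in chunks."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    # Path taken from the log above; adjust for your system (assumption).
    path = sys.argv[1] if len(sys.argv) > 1 else "/usr/lib/ignition/user.ign"
    print(f"SHA512({path}) = {sha512_of_file(path)}")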
May 14 23:38:09.855179 systemd-networkd[803]: lo: Link UP May 14 23:38:09.855186 systemd-networkd[803]: lo: Gained carrier May 14 23:38:09.856106 systemd-networkd[803]: Enumeration completed May 14 23:38:09.856361 systemd-networkd[803]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. May 14 23:38:09.856585 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 23:38:09.858656 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated May 14 23:38:09.858771 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps May 14 23:38:09.856743 systemd[1]: Reached target network.target - Network. May 14 23:38:09.856835 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 14 23:38:09.858935 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 14 23:38:09.859798 systemd-networkd[803]: ens192: Link UP May 14 23:38:09.859800 systemd-networkd[803]: ens192: Gained carrier May 14 23:38:09.880658 ignition[806]: Ignition 2.20.0 May 14 23:38:09.880938 ignition[806]: Stage: kargs May 14 23:38:09.881041 ignition[806]: no configs at "/usr/lib/ignition/base.d" May 14 23:38:09.881047 ignition[806]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:09.881898 ignition[806]: kargs: kargs passed May 14 23:38:09.881927 ignition[806]: Ignition finished successfully May 14 23:38:09.882838 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 14 23:38:09.883712 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 14 23:38:09.899407 ignition[813]: Ignition 2.20.0 May 14 23:38:09.899414 ignition[813]: Stage: disks May 14 23:38:09.899527 ignition[813]: no configs at "/usr/lib/ignition/base.d" May 14 23:38:09.899533 ignition[813]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:09.900050 ignition[813]: disks: disks passed May 14 23:38:09.900075 ignition[813]: Ignition finished successfully May 14 23:38:09.900826 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 14 23:38:09.901224 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 14 23:38:09.901364 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 14 23:38:09.901556 systemd[1]: Reached target local-fs.target - Local File Systems. May 14 23:38:09.901731 systemd[1]: Reached target sysinit.target - System Initialization. May 14 23:38:09.901913 systemd[1]: Reached target basic.target - Basic System. May 14 23:38:09.902614 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 14 23:38:09.926088 systemd-fsck[821]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 14 23:38:09.927432 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 14 23:38:09.928283 systemd[1]: Mounting sysroot.mount - /sysroot... May 14 23:38:10.054545 kernel: EXT4-fs (sda9): mounted filesystem 81735587-bac5-4d9e-ae49-5642e655af7f r/w with ordered data mode. Quota mode: none. May 14 23:38:10.055012 systemd[1]: Mounted sysroot.mount - /sysroot. May 14 23:38:10.055392 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 14 23:38:10.089387 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 23:38:10.104729 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
May 14 23:38:10.105007 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 14 23:38:10.105034 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 14 23:38:10.105048 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 14 23:38:10.116273 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 14 23:38:10.117200 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 14 23:38:10.180521 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (829) May 14 23:38:10.187090 kernel: BTRFS info (device sda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 14 23:38:10.187126 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 14 23:38:10.187135 kernel: BTRFS info (device sda6): using free space tree May 14 23:38:10.195513 kernel: BTRFS info (device sda6): enabling ssd optimizations May 14 23:38:10.197213 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 14 23:38:10.211955 initrd-setup-root[853]: cut: /sysroot/etc/passwd: No such file or directory May 14 23:38:10.214368 initrd-setup-root[860]: cut: /sysroot/etc/group: No such file or directory May 14 23:38:10.216844 initrd-setup-root[867]: cut: /sysroot/etc/shadow: No such file or directory May 14 23:38:10.219018 initrd-setup-root[874]: cut: /sysroot/etc/gshadow: No such file or directory May 14 23:38:10.287088 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 14 23:38:10.287875 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 14 23:38:10.289572 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 14 23:38:10.300533 kernel: BTRFS info (device sda6): last unmount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 14 23:38:10.318030 ignition[942]: INFO : Ignition 2.20.0 May 14 23:38:10.318030 ignition[942]: INFO : Stage: mount May 14 23:38:10.318385 ignition[942]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 23:38:10.318385 ignition[942]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:10.319215 ignition[942]: INFO : mount: mount passed May 14 23:38:10.319341 ignition[942]: INFO : Ignition finished successfully May 14 23:38:10.319268 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 14 23:38:10.320024 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 14 23:38:10.320665 systemd[1]: Starting ignition-files.service - Ignition (files)... May 14 23:38:10.412842 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 14 23:38:10.413760 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 14 23:38:10.431521 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (954) May 14 23:38:10.434157 kernel: BTRFS info (device sda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 14 23:38:10.434183 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 14 23:38:10.434191 kernel: BTRFS info (device sda6): using free space tree May 14 23:38:10.438512 kernel: BTRFS info (device sda6): enabling ssd optimizations May 14 23:38:10.439128 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 14 23:38:10.450674 ignition[971]: INFO : Ignition 2.20.0 May 14 23:38:10.450674 ignition[971]: INFO : Stage: files May 14 23:38:10.451045 ignition[971]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 23:38:10.451045 ignition[971]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:10.451432 ignition[971]: DEBUG : files: compiled without relabeling support, skipping May 14 23:38:10.452242 ignition[971]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 14 23:38:10.452242 ignition[971]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 14 23:38:10.455329 ignition[971]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 14 23:38:10.455466 ignition[971]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 14 23:38:10.455606 ignition[971]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 14 23:38:10.455566 unknown[971]: wrote ssh authorized keys file for user: core May 14 23:38:10.457078 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 23:38:10.457239 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 14 23:38:10.496426 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 14 23:38:10.762859 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 14 23:38:10.762859 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 14 23:38:10.763319 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 14 23:38:10.764779 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 23:38:10.764779 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 23:38:10.764779 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 23:38:10.764779 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1 May 14 23:38:11.305990 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 14 23:38:11.634675 systemd-networkd[803]: ens192: Gained IPv6LL May 14 23:38:12.453205 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw" May 14 23:38:12.453710 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" May 14 23:38:12.453710 ignition[971]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" May 14 23:38:12.453710 ignition[971]: INFO : files: op(c): [started] processing unit "prepare-helm.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 14 23:38:12.453710 ignition[971]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 14 23:38:12.455175 ignition[971]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" May 14 23:38:12.455175 ignition[971]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" May 14 23:38:12.474814 ignition[971]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" May 14 23:38:12.477115 ignition[971]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 14 23:38:12.477115 ignition[971]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" May 14 23:38:12.477115 ignition[971]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" May 14 23:38:12.477115 ignition[971]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" May 14 23:38:12.477115 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" May 14 23:38:12.477115 ignition[971]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" May 14 23:38:12.477115 ignition[971]: INFO : files: files passed May 14 23:38:12.477115 ignition[971]: INFO : Ignition finished successfully May 14 
23:38:12.478118 systemd[1]: Finished ignition-files.service - Ignition (files). May 14 23:38:12.478956 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 14 23:38:12.480570 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 14 23:38:12.490243 systemd[1]: ignition-quench.service: Deactivated successfully. May 14 23:38:12.490318 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 14 23:38:12.493432 initrd-setup-root-after-ignition[1003]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 14 23:38:12.493432 initrd-setup-root-after-ignition[1003]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 14 23:38:12.494458 initrd-setup-root-after-ignition[1007]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 14 23:38:12.495568 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 14 23:38:12.495965 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 14 23:38:12.496603 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 14 23:38:12.523826 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 14 23:38:12.523891 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 14 23:38:12.524188 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 14 23:38:12.524305 systemd[1]: Reached target initrd.target - Initrd Default Target. May 14 23:38:12.524514 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 14 23:38:12.524994 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 14 23:38:12.534379 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 14 23:38:12.535360 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 14 23:38:12.549394 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 14 23:38:12.549605 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 23:38:12.549886 systemd[1]: Stopped target timers.target - Timer Units. May 14 23:38:12.550083 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 14 23:38:12.550164 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 14 23:38:12.550522 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 14 23:38:12.550681 systemd[1]: Stopped target basic.target - Basic System. May 14 23:38:12.550861 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 14 23:38:12.551047 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 14 23:38:12.551248 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 14 23:38:12.551457 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 14 23:38:12.551826 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 14 23:38:12.552038 systemd[1]: Stopped target sysinit.target - System Initialization. May 14 23:38:12.552242 systemd[1]: Stopped target local-fs.target - Local File Systems. May 14 23:38:12.552426 systemd[1]: Stopped target swap.target - Swaps. May 14 23:38:12.552592 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
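[The Ignition files stage that just finished above wrote user files, a sysext symlink, and systemd unit presets. As a rough illustration only (not the config this host actually received), the Python sketch below emits JSON in the general shape of an Ignition v3 config that would request one such file, the kubernetes.raw link, and the two unit presets; the field names follow the public Ignition v3 spec as best recalled and should be treated as an assumption.]

import json

# Hypothetical, minimal config shape; not the actual config fetched by this host.
config = {
    "ignition": {"version": "3.4.0"},
    "storage": {
        "files": [
            {
                "path": "/home/core/install.sh",  # a path seen in the log above
                "mode": 0o755,
                "contents": {"source": "data:,echo%20hello%0A"},
            }
        ],
        "links": [
            {
                "path": "/etc/extensions/kubernetes.raw",  # link written in the log above
                "target": "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw",
            }
        ],
    },
    "systemd": {
        "units": [
            {"name": "prepare-helm.service", "enabled": True},
            {"name": "coreos-metadata.service", "enabled": False},
        ]
    },
}

print(json.dumps(config, indent=2))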
May 14 23:38:12.552665 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 14 23:38:12.552995 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 14 23:38:12.553161 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 23:38:12.553339 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 14 23:38:12.553389 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 23:38:12.553602 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 14 23:38:12.553665 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 14 23:38:12.553940 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 14 23:38:12.554007 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 14 23:38:12.554228 systemd[1]: Stopped target paths.target - Path Units. May 14 23:38:12.554359 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 14 23:38:12.557532 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 23:38:12.557718 systemd[1]: Stopped target slices.target - Slice Units. May 14 23:38:12.557919 systemd[1]: Stopped target sockets.target - Socket Units. May 14 23:38:12.558099 systemd[1]: iscsid.socket: Deactivated successfully. May 14 23:38:12.558167 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 14 23:38:12.558378 systemd[1]: iscsiuio.socket: Deactivated successfully. May 14 23:38:12.558442 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 14 23:38:12.558676 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 14 23:38:12.558751 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 14 23:38:12.558968 systemd[1]: ignition-files.service: Deactivated successfully. May 14 23:38:12.559027 systemd[1]: Stopped ignition-files.service - Ignition (files). May 14 23:38:12.559803 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 14 23:38:12.562220 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 14 23:38:12.562349 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 14 23:38:12.562446 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 14 23:38:12.562696 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 14 23:38:12.562780 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 14 23:38:12.565610 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 14 23:38:12.565657 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 14 23:38:12.574008 ignition[1027]: INFO : Ignition 2.20.0 May 14 23:38:12.574362 ignition[1027]: INFO : Stage: umount May 14 23:38:12.574351 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 14 23:38:12.574805 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d" May 14 23:38:12.574805 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" May 14 23:38:12.575483 ignition[1027]: INFO : umount: umount passed May 14 23:38:12.575686 ignition[1027]: INFO : Ignition finished successfully May 14 23:38:12.576368 systemd[1]: ignition-mount.service: Deactivated successfully. May 14 23:38:12.576426 systemd[1]: Stopped ignition-mount.service - Ignition (mount). 
May 14 23:38:12.577022 systemd[1]: Stopped target network.target - Network. May 14 23:38:12.577283 systemd[1]: ignition-disks.service: Deactivated successfully. May 14 23:38:12.577460 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 14 23:38:12.577730 systemd[1]: ignition-kargs.service: Deactivated successfully. May 14 23:38:12.577760 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 14 23:38:12.578066 systemd[1]: ignition-setup.service: Deactivated successfully. May 14 23:38:12.578092 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 14 23:38:12.578436 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 14 23:38:12.578461 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 14 23:38:12.578870 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 14 23:38:12.579010 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 14 23:38:12.583474 systemd[1]: systemd-resolved.service: Deactivated successfully. May 14 23:38:12.583544 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 14 23:38:12.585036 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 14 23:38:12.585162 systemd[1]: systemd-networkd.service: Deactivated successfully. May 14 23:38:12.585208 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 14 23:38:12.585929 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 14 23:38:12.586364 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 14 23:38:12.586391 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 14 23:38:12.587105 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 14 23:38:12.587203 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 14 23:38:12.587230 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 14 23:38:12.587361 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. May 14 23:38:12.587383 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. May 14 23:38:12.587515 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 14 23:38:12.587538 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 14 23:38:12.587709 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 14 23:38:12.587730 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 14 23:38:12.587861 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 14 23:38:12.587882 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 23:38:12.588772 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 23:38:12.589653 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 14 23:38:12.589690 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 14 23:38:12.599607 systemd[1]: network-cleanup.service: Deactivated successfully. May 14 23:38:12.599677 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 14 23:38:12.600825 systemd[1]: systemd-udevd.service: Deactivated successfully. May 14 23:38:12.600900 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
May 14 23:38:12.601217 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 14 23:38:12.601242 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 14 23:38:12.601454 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 14 23:38:12.601472 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 14 23:38:12.601645 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 14 23:38:12.601671 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 14 23:38:12.601927 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 14 23:38:12.601951 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 14 23:38:12.602368 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 14 23:38:12.602391 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 14 23:38:12.603151 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 14 23:38:12.603262 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 14 23:38:12.603288 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 23:38:12.603464 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 14 23:38:12.603488 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 23:38:12.603676 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 14 23:38:12.603698 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 14 23:38:12.603870 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 14 23:38:12.603891 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 14 23:38:12.604738 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 14 23:38:12.604772 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 14 23:38:12.612910 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 14 23:38:12.612986 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 14 23:38:12.641775 systemd[1]: sysroot-boot.service: Deactivated successfully. May 14 23:38:12.641849 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 14 23:38:12.642242 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 14 23:38:12.642371 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 14 23:38:12.642404 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 14 23:38:12.643015 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 14 23:38:12.661663 systemd[1]: Switching root. May 14 23:38:12.691854 systemd-journald[218]: Journal stopped May 14 23:38:14.922951 systemd-journald[218]: Received SIGTERM from PID 1 (systemd). 
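[Everything in this section is journald/console output captured around the initrd-to-rootfs switch (the journal is stopped above and restarted in the real root below). As a small convenience sketch, assuming journalctl is available and behaves as documented, the Python below shells out to it to dump the current boot's log in a similarly precise timestamp format.]

import subprocess

def dump_boot_log() -> str:
    """Return this boot's journal in short-precise format (microsecond timestamps)."""
    result = subprocess.run(
        ["journalctl", "-b", "-o", "short-precise", "--no-pager"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

if __name__ == "__main__":
    # Print only the first few lines to keep the output manageable.
    for line in dump_boot_log().splitlines()[:20]:
        print(line)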
May 14 23:38:14.922990 kernel: SELinux: policy capability network_peer_controls=1 May 14 23:38:14.922999 kernel: SELinux: policy capability open_perms=1 May 14 23:38:14.923005 kernel: SELinux: policy capability extended_socket_class=1 May 14 23:38:14.923010 kernel: SELinux: policy capability always_check_network=0 May 14 23:38:14.923017 kernel: SELinux: policy capability cgroup_seclabel=1 May 14 23:38:14.923025 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 14 23:38:14.923030 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 14 23:38:14.923036 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 14 23:38:14.923042 systemd[1]: Successfully loaded SELinux policy in 32.538ms. May 14 23:38:14.923049 kernel: audit: type=1403 audit(1747265893.268:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 14 23:38:14.923056 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.795ms. May 14 23:38:14.923063 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 14 23:38:14.923070 systemd[1]: Detected virtualization vmware. May 14 23:38:14.923077 systemd[1]: Detected architecture x86-64. May 14 23:38:14.923084 systemd[1]: Detected first boot. May 14 23:38:14.923090 systemd[1]: Initializing machine ID from random generator. May 14 23:38:14.923098 zram_generator::config[1073]: No configuration found. May 14 23:38:14.923199 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc May 14 23:38:14.923211 kernel: Guest personality initialized and is active May 14 23:38:14.923217 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 14 23:38:14.923223 kernel: Initialized host personality May 14 23:38:14.923229 kernel: NET: Registered PF_VSOCK protocol family May 14 23:38:14.923235 systemd[1]: Populated /etc with preset unit settings. May 14 23:38:14.923246 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 14 23:38:14.923253 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" May 14 23:38:14.923260 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 14 23:38:14.923266 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 14 23:38:14.923273 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 14 23:38:14.923279 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 14 23:38:14.923286 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 14 23:38:14.923294 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 14 23:38:14.923301 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 14 23:38:14.923307 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 14 23:38:14.923314 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 14 23:38:14.923321 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. 
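[The coreos-metadata.service drop-in quoted above shells out to ip/grep to pull the host's private IPv4 from ens192, which is why systemd warns about the unescaped "$" sequences. The Python below is a rough equivalent of that pipeline, assuming the interface name ens192 and the 10.x heuristic from the quoted unit; it is a sketch, not the unit's actual implementation.]

import re
import subprocess

def private_ipv4(interface: str = "ens192") -> str | None:
    """Mimic the quoted pipeline: first 'inet 10.*' address on the interface."""
    out = subprocess.run(
        ["ip", "addr", "show", interface],
        capture_output=True,
        text=True,
        check=True,
    ).stdout
    for line in out.splitlines():
        match = re.search(r"inet (10\.[\d.]+)/", line)
        if match:
            return match.group(1)
    return None

if __name__ == "__main__":
    print(f"COREOS_CUSTOM_PRIVATE_IPV4={private_ipv4() or ''}")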
May 14 23:38:14.923328 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 14 23:38:14.923335 systemd[1]: Created slice user.slice - User and Session Slice. May 14 23:38:14.923342 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 14 23:38:14.923350 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 14 23:38:14.923359 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 14 23:38:14.923366 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 14 23:38:14.923373 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 14 23:38:14.923380 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 14 23:38:14.923387 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 14 23:38:14.923395 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 14 23:38:14.923403 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 14 23:38:14.923410 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 14 23:38:14.923416 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 14 23:38:14.923423 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 14 23:38:14.923430 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 14 23:38:14.923437 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 14 23:38:14.923444 systemd[1]: Reached target slices.target - Slice Units. May 14 23:38:14.923451 systemd[1]: Reached target swap.target - Swaps. May 14 23:38:14.923457 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 14 23:38:14.923465 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 14 23:38:14.923473 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 14 23:38:14.923480 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 14 23:38:14.923487 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 14 23:38:14.923495 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 14 23:38:14.924535 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 14 23:38:14.924551 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 14 23:38:14.924560 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 14 23:38:14.924567 systemd[1]: Mounting media.mount - External Media Directory... May 14 23:38:14.924574 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:14.924581 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 14 23:38:14.924588 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 14 23:38:14.924598 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 14 23:38:14.925498 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). 
May 14 23:38:14.925530 systemd[1]: Reached target machines.target - Containers. May 14 23:38:14.925538 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 14 23:38:14.925545 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... May 14 23:38:14.925552 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 14 23:38:14.925559 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 14 23:38:14.925565 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 23:38:14.925572 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 14 23:38:14.925582 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 23:38:14.925590 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 14 23:38:14.925597 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 23:38:14.925604 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 14 23:38:14.925611 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 14 23:38:14.925618 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 14 23:38:14.925625 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 14 23:38:14.925632 systemd[1]: Stopped systemd-fsck-usr.service. May 14 23:38:14.925641 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 23:38:14.925649 systemd[1]: Starting systemd-journald.service - Journal Service... May 14 23:38:14.925655 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 14 23:38:14.925662 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 14 23:38:14.925669 kernel: fuse: init (API version 7.39) May 14 23:38:14.925676 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 14 23:38:14.925710 systemd-journald[1156]: Collecting audit messages is disabled. May 14 23:38:14.925729 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 14 23:38:14.925737 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 14 23:38:14.925745 systemd-journald[1156]: Journal started May 14 23:38:14.925762 systemd-journald[1156]: Runtime Journal (/run/log/journal/f048c5a2b6884c7694ecc360289395e8) is 4.8M, max 38.6M, 33.7M free. May 14 23:38:14.740387 systemd[1]: Queued start job for default target multi-user.target. May 14 23:38:14.748695 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 14 23:38:14.748928 systemd[1]: systemd-journald.service: Deactivated successfully. May 14 23:38:14.926278 jq[1143]: true May 14 23:38:14.932399 systemd[1]: verity-setup.service: Deactivated successfully. May 14 23:38:14.932437 systemd[1]: Stopped verity-setup.service. May 14 23:38:14.932447 kernel: loop: module loaded May 14 23:38:14.932455 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 14 23:38:14.932468 systemd[1]: Started systemd-journald.service - Journal Service. May 14 23:38:14.935146 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 14 23:38:14.935766 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 14 23:38:14.935915 systemd[1]: Mounted media.mount - External Media Directory. May 14 23:38:14.936118 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 14 23:38:14.936853 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 14 23:38:14.937005 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 14 23:38:14.938787 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 14 23:38:14.939063 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 14 23:38:14.939168 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 14 23:38:14.939402 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 23:38:14.939492 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 23:38:14.939720 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 23:38:14.939815 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 23:38:14.940041 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 14 23:38:14.940134 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 14 23:38:14.940353 systemd[1]: modprobe@loop.service: Deactivated successfully. May 14 23:38:14.940442 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 23:38:14.940988 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 14 23:38:14.948294 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 14 23:38:14.948843 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 14 23:38:14.952305 jq[1181]: true May 14 23:38:14.954491 systemd[1]: Reached target network-pre.target - Preparation for Network. May 14 23:38:14.958148 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 14 23:38:14.961321 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 14 23:38:14.961456 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 14 23:38:14.961482 systemd[1]: Reached target local-fs.target - Local File Systems. May 14 23:38:14.962196 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 14 23:38:14.968167 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 14 23:38:14.970991 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 14 23:38:14.971168 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 23:38:14.991835 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 14 23:38:14.998236 kernel: ACPI: bus type drm_connector registered May 14 23:38:14.997899 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 14 23:38:14.998051 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
May 14 23:38:15.000614 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 14 23:38:15.000761 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 14 23:38:15.003540 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 14 23:38:15.006603 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 14 23:38:15.010701 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 14 23:38:15.012415 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 14 23:38:15.015782 systemd[1]: modprobe@drm.service: Deactivated successfully. May 14 23:38:15.015953 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 14 23:38:15.016582 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 14 23:38:15.018296 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 14 23:38:15.022840 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 14 23:38:15.023132 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 14 23:38:15.036217 systemd-journald[1156]: Time spent on flushing to /var/log/journal/f048c5a2b6884c7694ecc360289395e8 is 202.201ms for 1849 entries. May 14 23:38:15.036217 systemd-journald[1156]: System Journal (/var/log/journal/f048c5a2b6884c7694ecc360289395e8) is 8M, max 584.8M, 576.8M free. May 14 23:38:15.349166 systemd-journald[1156]: Received client request to flush runtime journal. May 14 23:38:15.349209 kernel: loop0: detected capacity change from 0 to 151640 May 14 23:38:15.159912 ignition[1202]: Ignition 2.20.0 May 14 23:38:15.060517 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 14 23:38:15.160069 ignition[1202]: deleting config from guestinfo properties May 14 23:38:15.061275 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 14 23:38:15.236961 ignition[1202]: Successfully deleted config May 14 23:38:15.071383 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 14 23:38:15.105569 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 14 23:38:15.119901 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 14 23:38:15.123165 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 14 23:38:15.144787 udevadm[1232]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. May 14 23:38:15.146060 systemd-tmpfiles[1217]: ACLs are not supported, ignoring. May 14 23:38:15.146074 systemd-tmpfiles[1217]: ACLs are not supported, ignoring. May 14 23:38:15.150155 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 14 23:38:15.151232 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 14 23:38:15.239780 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). May 14 23:38:15.352670 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. 
May 14 23:38:15.371533 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 14 23:38:15.372415 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 14 23:38:15.387654 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 14 23:38:15.391649 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 14 23:38:15.397539 kernel: loop1: detected capacity change from 0 to 109808 May 14 23:38:15.411367 systemd-tmpfiles[1250]: ACLs are not supported, ignoring. May 14 23:38:15.411645 systemd-tmpfiles[1250]: ACLs are not supported, ignoring. May 14 23:38:15.415077 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 14 23:38:15.434561 kernel: loop2: detected capacity change from 0 to 2960 May 14 23:38:15.463526 kernel: loop3: detected capacity change from 0 to 205544 May 14 23:38:15.515521 kernel: loop4: detected capacity change from 0 to 151640 May 14 23:38:15.549528 kernel: loop5: detected capacity change from 0 to 109808 May 14 23:38:15.579612 kernel: loop6: detected capacity change from 0 to 2960 May 14 23:38:15.588527 kernel: loop7: detected capacity change from 0 to 205544 May 14 23:38:15.696404 (sd-merge)[1256]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. May 14 23:38:15.696740 (sd-merge)[1256]: Merged extensions into '/usr'. May 14 23:38:15.701935 systemd[1]: Reload requested from client PID 1216 ('systemd-sysext') (unit systemd-sysext.service)... May 14 23:38:15.701945 systemd[1]: Reloading... May 14 23:38:15.779567 zram_generator::config[1284]: No configuration found. May 14 23:38:15.859188 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 14 23:38:15.880340 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 23:38:15.937454 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 14 23:38:15.937901 systemd[1]: Reloading finished in 235 ms. May 14 23:38:15.962158 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 14 23:38:15.962527 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 14 23:38:15.977551 systemd[1]: Starting ensure-sysext.service... May 14 23:38:15.980425 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 14 23:38:15.983599 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 14 23:38:16.003496 systemd[1]: Reload requested from client PID 1340 ('systemctl') (unit ensure-sysext.service)... May 14 23:38:16.003514 systemd[1]: Reloading... May 14 23:38:16.011218 systemd-tmpfiles[1341]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 14 23:38:16.011389 systemd-tmpfiles[1341]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 14 23:38:16.011915 systemd-tmpfiles[1341]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 14 23:38:16.012084 systemd-tmpfiles[1341]: ACLs are not supported, ignoring. 
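[The (sd-merge) lines above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes, and oem-vmware extension images into /usr. The Python sketch below simply enumerates candidate extension images in the directories systemd-sysext is documented to scan (/etc/extensions, /run/extensions, /var/lib/extensions; treat that search-path list as an assumption) and resolves symlinks, mirroring the kubernetes.raw link written during the files stage.]

from pathlib import Path

# Search paths as documented for systemd-sysext (assumption; check your systemd version).
SEARCH_PATHS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_extension_images() -> list[tuple[str, str]]:
    """Return (image path, resolved target) pairs for *.raw images and extension dirs."""
    found = []
    for base in SEARCH_PATHS:
        root = Path(base)
        if not root.is_dir():
            continue
        for entry in sorted(root.iterdir()):
            # systemd-sysext accepts raw disk images and plain directory trees.
            if entry.suffix == ".raw" or entry.is_dir():
                found.append((str(entry), str(entry.resolve())))
    return found

if __name__ == "__main__":
    for image, target in list_extension_images():
        print(f"{image} -> {target}")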
May 14 23:38:16.012123 systemd-tmpfiles[1341]: ACLs are not supported, ignoring. May 14 23:38:16.016576 systemd-udevd[1342]: Using default interface naming scheme 'v255'. May 14 23:38:16.041065 systemd-tmpfiles[1341]: Detected autofs mount point /boot during canonicalization of boot. May 14 23:38:16.041165 systemd-tmpfiles[1341]: Skipping /boot May 14 23:38:16.049056 systemd-tmpfiles[1341]: Detected autofs mount point /boot during canonicalization of boot. May 14 23:38:16.049164 systemd-tmpfiles[1341]: Skipping /boot May 14 23:38:16.059550 zram_generator::config[1371]: No configuration found. May 14 23:38:16.127937 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 14 23:38:16.149031 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 23:38:16.202420 ldconfig[1211]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 14 23:38:16.231828 systemd[1]: Reloading finished in 228 ms. May 14 23:38:16.235404 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 14 23:38:16.235787 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 14 23:38:16.250899 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 14 23:38:16.257523 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 14 23:38:16.266280 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 14 23:38:16.267334 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:16.268547 kernel: ACPI: button: Power Button [PWRF] May 14 23:38:16.270357 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 14 23:38:16.272191 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 14 23:38:16.273740 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 23:38:16.278546 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 14 23:38:16.283201 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 14 23:38:16.283406 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 23:38:16.283486 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 23:38:16.285988 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 14 23:38:16.288519 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 14 23:38:16.291632 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 14 23:38:16.293915 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 14 23:38:16.296568 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 14 23:38:16.297579 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 23:38:16.297695 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 23:38:16.300702 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1438) May 14 23:38:16.306798 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 14 23:38:16.309668 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:16.317335 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 14 23:38:16.317498 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 23:38:16.317713 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 23:38:16.317783 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:16.338730 systemd[1]: Finished ensure-sysext.service. May 14 23:38:16.341080 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:16.346676 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 14 23:38:16.347651 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 14 23:38:16.347678 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 14 23:38:16.375705 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 14 23:38:16.375858 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 14 23:38:16.377563 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 14 23:38:16.377857 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 14 23:38:16.377972 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 14 23:38:16.378209 systemd[1]: modprobe@loop.service: Deactivated successfully. May 14 23:38:16.378311 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 14 23:38:16.378724 systemd[1]: modprobe@drm.service: Deactivated successfully. May 14 23:38:16.378825 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 14 23:38:16.387637 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 14 23:38:16.387785 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 14 23:38:16.396151 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. May 14 23:38:16.403969 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 14 23:38:16.404396 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 14 23:38:16.410153 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
May 14 23:38:16.412571 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 14 23:38:16.412623 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 14 23:38:16.413842 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 14 23:38:16.413956 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 14 23:38:16.423269 augenrules[1501]: No rules May 14 23:38:16.423982 systemd[1]: audit-rules.service: Deactivated successfully. May 14 23:38:16.424134 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 23:38:16.433522 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! May 14 23:38:16.433965 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 14 23:38:16.442228 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 14 23:38:16.460596 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 14 23:38:16.494532 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 May 14 23:38:16.528094 (udev-worker)[1434]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. May 14 23:38:16.531529 kernel: mousedev: PS/2 mouse device common for all mice May 14 23:38:16.532627 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 14 23:38:16.549998 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 14 23:38:16.550202 systemd[1]: Reached target time-set.target - System Time Set. May 14 23:38:16.558211 systemd-networkd[1463]: lo: Link UP May 14 23:38:16.558422 systemd-resolved[1464]: Positive Trust Anchors: May 14 23:38:16.558428 systemd-resolved[1464]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 14 23:38:16.558452 systemd-resolved[1464]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 14 23:38:16.559594 systemd-networkd[1463]: lo: Gained carrier May 14 23:38:16.561001 systemd-networkd[1463]: Enumeration completed May 14 23:38:16.561119 systemd[1]: Started systemd-networkd.service - Network Configuration. May 14 23:38:16.561380 systemd-networkd[1463]: ens192: Configuring with /etc/systemd/network/00-vmware.network. May 14 23:38:16.563133 systemd-resolved[1464]: Defaulting to hostname 'linux'. May 14 23:38:16.565340 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated May 14 23:38:16.565572 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps May 14 23:38:16.566940 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
May 14 23:38:16.567250 systemd-networkd[1463]: ens192: Link UP May 14 23:38:16.567567 systemd-networkd[1463]: ens192: Gained carrier May 14 23:38:16.569767 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 14 23:38:16.570178 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 14 23:38:16.570318 systemd[1]: Reached target network.target - Network. May 14 23:38:16.570397 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 14 23:38:16.572218 systemd-timesyncd[1487]: Network configuration changed, trying to establish connection. May 14 23:38:16.579764 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 14 23:38:16.582550 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 14 23:38:16.589061 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 14 23:38:16.596172 lvm[1528]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 14 23:38:16.623454 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 14 23:38:16.627172 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 14 23:38:16.628342 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 14 23:38:16.628680 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 14 23:38:16.629146 systemd[1]: Reached target sysinit.target - System Initialization. May 14 23:38:16.629566 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 14 23:38:16.629714 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 14 23:38:16.629945 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 14 23:38:16.630121 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 14 23:38:16.630253 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 14 23:38:16.630376 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 14 23:38:16.630394 systemd[1]: Reached target paths.target - Path Units. May 14 23:38:16.630498 systemd[1]: Reached target timers.target - Timer Units. May 14 23:38:16.631442 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 14 23:38:16.635367 systemd[1]: Starting docker.socket - Docker Socket for the API... May 14 23:38:16.637362 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 14 23:38:16.637780 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 14 23:38:16.637919 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 14 23:38:16.638534 lvm[1533]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 14 23:38:16.639726 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 14 23:38:16.640114 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 14 23:38:16.640769 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 14 23:38:16.640930 systemd[1]: Reached target sockets.target - Socket Units. 
May 14 23:38:16.641038 systemd[1]: Reached target basic.target - Basic System. May 14 23:38:16.641170 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 14 23:38:16.641189 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 14 23:38:16.641983 systemd[1]: Starting containerd.service - containerd container runtime... May 14 23:38:16.645732 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 14 23:38:16.647587 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 14 23:38:16.650184 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 14 23:38:16.650325 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 14 23:38:16.653066 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 14 23:38:16.659363 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 14 23:38:16.661324 jq[1539]: false May 14 23:38:16.662709 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 14 23:38:16.666658 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 14 23:38:16.679002 systemd[1]: Starting systemd-logind.service - User Login Management... May 14 23:38:16.684950 extend-filesystems[1540]: Found loop4 May 14 23:38:16.684950 extend-filesystems[1540]: Found loop5 May 14 23:38:16.684950 extend-filesystems[1540]: Found loop6 May 14 23:38:16.684950 extend-filesystems[1540]: Found loop7 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda May 14 23:38:16.684950 extend-filesystems[1540]: Found sda1 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda2 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda3 May 14 23:38:16.684950 extend-filesystems[1540]: Found usr May 14 23:38:16.684950 extend-filesystems[1540]: Found sda4 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda6 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda7 May 14 23:38:16.684950 extend-filesystems[1540]: Found sda9 May 14 23:38:16.684950 extend-filesystems[1540]: Checking size of /dev/sda9 May 14 23:38:16.679670 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 14 23:38:16.680210 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 14 23:38:16.681631 systemd[1]: Starting update-engine.service - Update Engine... May 14 23:38:16.683960 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 14 23:38:16.688493 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... May 14 23:38:16.692701 extend-filesystems[1540]: Old size kept for /dev/sda9 May 14 23:38:16.692892 extend-filesystems[1540]: Found sr0 May 14 23:38:16.695551 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 14 23:38:16.698325 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 14 23:38:16.698458 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 14 23:38:16.698700 systemd[1]: extend-filesystems.service: Deactivated successfully. 
May 14 23:38:16.698816 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 14 23:38:16.700805 systemd[1]: motdgen.service: Deactivated successfully. May 14 23:38:16.700987 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 14 23:38:16.706458 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 14 23:38:16.706823 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 14 23:38:16.714892 jq[1555]: true May 14 23:38:16.724689 update_engine[1554]: I20250514 23:38:16.720618 1554 main.cc:92] Flatcar Update Engine starting May 14 23:38:16.734330 dbus-daemon[1538]: [system] SELinux support is enabled May 14 23:38:16.734481 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 14 23:38:16.739551 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 14 23:38:16.739581 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 14 23:38:16.741183 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 14 23:38:16.741195 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 14 23:38:16.748959 update_engine[1554]: I20250514 23:38:16.748920 1554 update_check_scheduler.cc:74] Next update check in 6m6s May 14 23:38:16.749597 systemd[1]: Started update-engine.service - Update Engine. May 14 23:38:16.749977 (ntainerd)[1574]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 14 23:38:16.762065 jq[1573]: true May 14 23:38:16.770393 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1436) May 14 23:38:16.777409 systemd-logind[1552]: Watching system buttons on /dev/input/event1 (Power Button) May 14 23:38:16.778236 systemd-logind[1552]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 14 23:38:16.779751 systemd-logind[1552]: New seat seat0. May 14 23:38:16.780436 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 14 23:38:16.780982 systemd[1]: Started systemd-logind.service - User Login Management. May 14 23:38:16.781951 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. May 14 23:38:16.807972 tar[1564]: linux-amd64/helm May 14 23:38:16.808088 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... May 14 23:38:16.835176 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. May 14 23:38:16.873134 unknown[1580]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath May 14 23:38:16.876992 unknown[1580]: Core dump limit set to -1 May 14 23:38:16.978547 bash[1601]: Updated "/home/core/.ssh/authorized_keys" May 14 23:38:16.979671 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 14 23:38:16.981062 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
May 14 23:38:17.039072 locksmithd[1578]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 14 23:38:17.047273 sshd_keygen[1565]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 14 23:38:17.098057 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 14 23:38:17.101735 systemd[1]: Starting issuegen.service - Generate /run/issue... May 14 23:38:17.118735 containerd[1574]: time="2025-05-14T23:38:17Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 14 23:38:17.119551 containerd[1574]: time="2025-05-14T23:38:17.119534355Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 14 23:38:17.122815 systemd[1]: issuegen.service: Deactivated successfully. May 14 23:38:17.122960 systemd[1]: Finished issuegen.service - Generate /run/issue. May 14 23:38:17.126818 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 14 23:38:17.134581 containerd[1574]: time="2025-05-14T23:38:17.134539829Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="33.166µs" May 14 23:38:17.134581 containerd[1574]: time="2025-05-14T23:38:17.134568323Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 14 23:38:17.134581 containerd[1574]: time="2025-05-14T23:38:17.134584817Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 14 23:38:17.134697 containerd[1574]: time="2025-05-14T23:38:17.134687832Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 14 23:38:17.134714 containerd[1574]: time="2025-05-14T23:38:17.134701744Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 14 23:38:17.134732 containerd[1574]: time="2025-05-14T23:38:17.134717489Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 23:38:17.134773 containerd[1574]: time="2025-05-14T23:38:17.134760349Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 23:38:17.134773 containerd[1574]: time="2025-05-14T23:38:17.134771616Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 23:38:17.135938 containerd[1574]: time="2025-05-14T23:38:17.135761256Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 23:38:17.135938 containerd[1574]: time="2025-05-14T23:38:17.135934527Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 23:38:17.135996 containerd[1574]: time="2025-05-14T23:38:17.135946396Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 23:38:17.135996 containerd[1574]: time="2025-05-14T23:38:17.135952190Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native 
type=io.containerd.snapshotter.v1 May 14 23:38:17.136037 containerd[1574]: time="2025-05-14T23:38:17.136013053Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 14 23:38:17.136196 containerd[1574]: time="2025-05-14T23:38:17.136180297Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 23:38:17.136220 containerd[1574]: time="2025-05-14T23:38:17.136207141Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 23:38:17.136220 containerd[1574]: time="2025-05-14T23:38:17.136216427Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 14 23:38:17.136275 containerd[1574]: time="2025-05-14T23:38:17.136242215Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 14 23:38:17.136549 containerd[1574]: time="2025-05-14T23:38:17.136423629Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 14 23:38:17.136549 containerd[1574]: time="2025-05-14T23:38:17.136469880Z" level=info msg="metadata content store policy set" policy=shared May 14 23:38:17.139621 containerd[1574]: time="2025-05-14T23:38:17.139596337Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139739261Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139772429Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139784157Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139793582Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139804496Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139816938Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139826033Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139835779Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139847678Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139854519Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139862327Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 14 
23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139955290Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139971130Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 14 23:38:17.140958 containerd[1574]: time="2025-05-14T23:38:17.139981553Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.139988817Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.139995217Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140002901Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140009408Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140015373Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140022157Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140028998Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140035608Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140079215Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140090928Z" level=info msg="Start snapshots syncer" May 14 23:38:17.141235 containerd[1574]: time="2025-05-14T23:38:17.140109211Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 14 23:38:17.141419 containerd[1574]: time="2025-05-14T23:38:17.140283500Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 14 23:38:17.141419 containerd[1574]: time="2025-05-14T23:38:17.140317290Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140371874Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140438953Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140455131Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140463240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140476016Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140488621Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140513239Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140526261Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140541624Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: 
time="2025-05-14T23:38:17.140551779Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140561343Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140594190Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140605336Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 23:38:17.141542 containerd[1574]: time="2025-05-14T23:38:17.140610947Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 23:38:17.141762 containerd[1574]: time="2025-05-14T23:38:17.140617289Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 23:38:17.141762 containerd[1574]: time="2025-05-14T23:38:17.140625531Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 14 23:38:17.141762 containerd[1574]: time="2025-05-14T23:38:17.140631942Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 14 23:38:17.141762 containerd[1574]: time="2025-05-14T23:38:17.140638232Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 14 23:38:17.143126 containerd[1574]: time="2025-05-14T23:38:17.142490381Z" level=info msg="runtime interface created" May 14 23:38:17.143126 containerd[1574]: time="2025-05-14T23:38:17.142519936Z" level=info msg="created NRI interface" May 14 23:38:17.143126 containerd[1574]: time="2025-05-14T23:38:17.142536420Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 14 23:38:17.143126 containerd[1574]: time="2025-05-14T23:38:17.142551032Z" level=info msg="Connect containerd service" May 14 23:38:17.143126 containerd[1574]: time="2025-05-14T23:38:17.142581242Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 14 23:38:17.143844 containerd[1574]: time="2025-05-14T23:38:17.143826958Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 23:38:17.149883 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 14 23:38:17.153363 systemd[1]: Started getty@tty1.service - Getty on tty1. May 14 23:38:17.157321 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 14 23:38:17.157563 systemd[1]: Reached target getty.target - Login Prompts. May 14 23:38:17.316344 containerd[1574]: time="2025-05-14T23:38:17.316276031Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 14 23:38:17.316344 containerd[1574]: time="2025-05-14T23:38:17.316328455Z" level=info msg=serving... 
address=/run/containerd/containerd.sock May 14 23:38:17.316434 containerd[1574]: time="2025-05-14T23:38:17.316352591Z" level=info msg="Start subscribing containerd event" May 14 23:38:17.316434 containerd[1574]: time="2025-05-14T23:38:17.316373984Z" level=info msg="Start recovering state" May 14 23:38:17.316464 containerd[1574]: time="2025-05-14T23:38:17.316447584Z" level=info msg="Start event monitor" May 14 23:38:17.316464 containerd[1574]: time="2025-05-14T23:38:17.316460088Z" level=info msg="Start cni network conf syncer for default" May 14 23:38:17.316493 containerd[1574]: time="2025-05-14T23:38:17.316468696Z" level=info msg="Start streaming server" May 14 23:38:17.316493 containerd[1574]: time="2025-05-14T23:38:17.316479645Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 14 23:38:17.316493 containerd[1574]: time="2025-05-14T23:38:17.316486324Z" level=info msg="runtime interface starting up..." May 14 23:38:17.316493 containerd[1574]: time="2025-05-14T23:38:17.316491531Z" level=info msg="starting plugins..." May 14 23:38:17.319206 containerd[1574]: time="2025-05-14T23:38:17.319185831Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 14 23:38:17.319566 containerd[1574]: time="2025-05-14T23:38:17.319553931Z" level=info msg="containerd successfully booted in 0.201139s" May 14 23:38:17.319746 systemd[1]: Started containerd.service - containerd container runtime. May 14 23:38:17.340998 tar[1564]: linux-amd64/LICENSE May 14 23:38:17.341269 tar[1564]: linux-amd64/README.md May 14 23:38:17.356573 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 14 23:38:17.906596 systemd-networkd[1463]: ens192: Gained IPv6LL May 14 23:38:17.906996 systemd-timesyncd[1487]: Network configuration changed, trying to establish connection. May 14 23:38:17.907843 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 14 23:38:17.908548 systemd[1]: Reached target network-online.target - Network is Online. May 14 23:38:17.909658 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... May 14 23:38:17.923529 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:38:17.926459 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 14 23:38:17.941920 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 14 23:38:17.962617 systemd[1]: coreos-metadata.service: Deactivated successfully. May 14 23:38:17.962842 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. May 14 23:38:17.963312 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 14 23:38:19.427633 systemd-timesyncd[1487]: Network configuration changed, trying to establish connection. May 14 23:38:19.455128 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:38:19.455693 systemd[1]: Reached target multi-user.target - Multi-User System. May 14 23:38:19.456080 systemd[1]: Startup finished in 969ms (kernel) + 6.641s (initrd) + 6.218s (userspace) = 13.830s. 
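The cri plugin configuration dumped at 23:38:17.140283 already carries SystemdCgroup:true for the runc runtime, which is what a kubelet using the systemd cgroup driver expects. If that value ever had to be set by hand, a minimal /etc/containerd/config.toml fragment for containerd 2.x (config version 3) would plausibly look like the following; the section path is inferred from the io.containerd.cri.v1.runtime plugin id seen in this log, not copied from a shipped file:

    version = 3
    [plugins.'io.containerd.cri.v1.runtime'.containerd.runtimes.runc.options]
      SystemdCgroup = true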
May 14 23:38:19.461136 (kubelet)[1727]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 23:38:19.574334 login[1674]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 23:38:19.575307 login[1677]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 23:38:19.580926 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 14 23:38:19.582741 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 14 23:38:19.589338 systemd-logind[1552]: New session 2 of user core. May 14 23:38:19.592461 systemd-logind[1552]: New session 1 of user core. May 14 23:38:19.599277 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 14 23:38:19.601389 systemd[1]: Starting user@500.service - User Manager for UID 500... May 14 23:38:19.614982 (systemd)[1734]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 14 23:38:19.616569 systemd-logind[1552]: New session c1 of user core. May 14 23:38:19.720823 systemd[1734]: Queued start job for default target default.target. May 14 23:38:19.730603 systemd[1734]: Created slice app.slice - User Application Slice. May 14 23:38:19.730667 systemd[1734]: Reached target paths.target - Paths. May 14 23:38:19.730742 systemd[1734]: Reached target timers.target - Timers. May 14 23:38:19.733548 systemd[1734]: Starting dbus.socket - D-Bus User Message Bus Socket... May 14 23:38:19.738430 systemd[1734]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 14 23:38:19.738519 systemd[1734]: Reached target sockets.target - Sockets. May 14 23:38:19.738588 systemd[1734]: Reached target basic.target - Basic System. May 14 23:38:19.738613 systemd[1734]: Reached target default.target - Main User Target. May 14 23:38:19.738629 systemd[1734]: Startup finished in 118ms. May 14 23:38:19.738780 systemd[1]: Started user@500.service - User Manager for UID 500. May 14 23:38:19.740079 systemd[1]: Started session-1.scope - Session 1 of User core. May 14 23:38:19.741129 systemd[1]: Started session-2.scope - Session 2 of User core. May 14 23:38:20.542295 kubelet[1727]: E0514 23:38:20.542255 1727 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 23:38:20.543483 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 23:38:20.543607 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 23:38:20.543997 systemd[1]: kubelet.service: Consumed 676ms CPU time, 237.6M memory peak. May 14 23:38:30.639434 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 14 23:38:30.640579 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:38:30.990406 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 14 23:38:31.001702 (kubelet)[1776]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 23:38:31.046025 kubelet[1776]: E0514 23:38:31.045987 1776 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 23:38:31.048811 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 23:38:31.048903 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 23:38:31.049099 systemd[1]: kubelet.service: Consumed 85ms CPU time, 97.8M memory peak. May 14 23:38:41.139375 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 14 23:38:41.140595 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:38:41.485213 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:38:41.487915 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 23:38:41.512404 kubelet[1792]: E0514 23:38:41.512344 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 23:38:41.513795 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 23:38:41.513924 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 23:38:41.514245 systemd[1]: kubelet.service: Consumed 94ms CPU time, 95.2M memory peak. May 14 23:38:46.937734 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 14 23:38:46.938429 systemd[1]: Started sshd@0-139.178.70.107:22-147.75.109.163:38468.service - OpenSSH per-connection server daemon (147.75.109.163:38468). May 14 23:38:46.977893 sshd[1800]: Accepted publickey for core from 147.75.109.163 port 38468 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:46.978578 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:46.981611 systemd-logind[1552]: New session 3 of user core. May 14 23:38:46.988594 systemd[1]: Started session-3.scope - Session 3 of User core. May 14 23:38:47.044630 systemd[1]: Started sshd@1-139.178.70.107:22-147.75.109.163:38484.service - OpenSSH per-connection server daemon (147.75.109.163:38484). May 14 23:38:47.077485 sshd[1805]: Accepted publickey for core from 147.75.109.163 port 38484 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.078220 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.080984 systemd-logind[1552]: New session 4 of user core. May 14 23:38:47.088613 systemd[1]: Started session-4.scope - Session 4 of User core. May 14 23:38:47.136523 sshd[1807]: Connection closed by 147.75.109.163 port 38484 May 14 23:38:47.137273 sshd-session[1805]: pam_unix(sshd:session): session closed for user core May 14 23:38:47.144165 systemd[1]: sshd@1-139.178.70.107:22-147.75.109.163:38484.service: Deactivated successfully. 
May 14 23:38:47.145238 systemd[1]: session-4.scope: Deactivated successfully. May 14 23:38:47.146273 systemd-logind[1552]: Session 4 logged out. Waiting for processes to exit. May 14 23:38:47.147274 systemd[1]: Started sshd@2-139.178.70.107:22-147.75.109.163:38486.service - OpenSSH per-connection server daemon (147.75.109.163:38486). May 14 23:38:47.149110 systemd-logind[1552]: Removed session 4. May 14 23:38:47.188936 sshd[1812]: Accepted publickey for core from 147.75.109.163 port 38486 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.189899 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.193936 systemd-logind[1552]: New session 5 of user core. May 14 23:38:47.199608 systemd[1]: Started session-5.scope - Session 5 of User core. May 14 23:38:47.245584 sshd[1815]: Connection closed by 147.75.109.163 port 38486 May 14 23:38:47.245941 sshd-session[1812]: pam_unix(sshd:session): session closed for user core May 14 23:38:47.259074 systemd[1]: sshd@2-139.178.70.107:22-147.75.109.163:38486.service: Deactivated successfully. May 14 23:38:47.260124 systemd[1]: session-5.scope: Deactivated successfully. May 14 23:38:47.260729 systemd-logind[1552]: Session 5 logged out. Waiting for processes to exit. May 14 23:38:47.261882 systemd[1]: Started sshd@3-139.178.70.107:22-147.75.109.163:38488.service - OpenSSH per-connection server daemon (147.75.109.163:38488). May 14 23:38:47.263782 systemd-logind[1552]: Removed session 5. May 14 23:38:47.305254 sshd[1820]: Accepted publickey for core from 147.75.109.163 port 38488 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.306064 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.309552 systemd-logind[1552]: New session 6 of user core. May 14 23:38:47.318592 systemd[1]: Started session-6.scope - Session 6 of User core. May 14 23:38:47.367758 sshd[1823]: Connection closed by 147.75.109.163 port 38488 May 14 23:38:47.368076 sshd-session[1820]: pam_unix(sshd:session): session closed for user core May 14 23:38:47.385497 systemd[1]: sshd@3-139.178.70.107:22-147.75.109.163:38488.service: Deactivated successfully. May 14 23:38:47.386621 systemd[1]: session-6.scope: Deactivated successfully. May 14 23:38:47.387677 systemd-logind[1552]: Session 6 logged out. Waiting for processes to exit. May 14 23:38:47.389346 systemd[1]: Started sshd@4-139.178.70.107:22-147.75.109.163:38504.service - OpenSSH per-connection server daemon (147.75.109.163:38504). May 14 23:38:47.390040 systemd-logind[1552]: Removed session 6. May 14 23:38:47.423804 sshd[1828]: Accepted publickey for core from 147.75.109.163 port 38504 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.424871 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.428091 systemd-logind[1552]: New session 7 of user core. May 14 23:38:47.435691 systemd[1]: Started session-7.scope - Session 7 of User core. 
May 14 23:38:47.493392 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 14 23:38:47.493636 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 23:38:47.507143 sudo[1832]: pam_unix(sudo:session): session closed for user root May 14 23:38:47.508242 sshd[1831]: Connection closed by 147.75.109.163 port 38504 May 14 23:38:47.508167 sshd-session[1828]: pam_unix(sshd:session): session closed for user core May 14 23:38:47.518997 systemd[1]: sshd@4-139.178.70.107:22-147.75.109.163:38504.service: Deactivated successfully. May 14 23:38:47.520223 systemd[1]: session-7.scope: Deactivated successfully. May 14 23:38:47.521553 systemd-logind[1552]: Session 7 logged out. Waiting for processes to exit. May 14 23:38:47.522698 systemd[1]: Started sshd@5-139.178.70.107:22-147.75.109.163:38516.service - OpenSSH per-connection server daemon (147.75.109.163:38516). May 14 23:38:47.523867 systemd-logind[1552]: Removed session 7. May 14 23:38:47.566921 sshd[1837]: Accepted publickey for core from 147.75.109.163 port 38516 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.567680 sshd-session[1837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.571171 systemd-logind[1552]: New session 8 of user core. May 14 23:38:47.578612 systemd[1]: Started session-8.scope - Session 8 of User core. May 14 23:38:47.628631 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 14 23:38:47.628877 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 23:38:47.631472 sudo[1842]: pam_unix(sudo:session): session closed for user root May 14 23:38:47.635151 sudo[1841]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 14 23:38:47.635325 sudo[1841]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 23:38:47.643459 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 14 23:38:47.669906 augenrules[1864]: No rules May 14 23:38:47.670215 systemd[1]: audit-rules.service: Deactivated successfully. May 14 23:38:47.670345 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 23:38:47.671042 sudo[1841]: pam_unix(sudo:session): session closed for user root May 14 23:38:47.671732 sshd[1840]: Connection closed by 147.75.109.163 port 38516 May 14 23:38:47.672013 sshd-session[1837]: pam_unix(sshd:session): session closed for user core May 14 23:38:47.678288 systemd[1]: sshd@5-139.178.70.107:22-147.75.109.163:38516.service: Deactivated successfully. May 14 23:38:47.679651 systemd[1]: session-8.scope: Deactivated successfully. May 14 23:38:47.680276 systemd-logind[1552]: Session 8 logged out. Waiting for processes to exit. May 14 23:38:47.681468 systemd[1]: Started sshd@6-139.178.70.107:22-147.75.109.163:38526.service - OpenSSH per-connection server daemon (147.75.109.163:38526). May 14 23:38:47.682234 systemd-logind[1552]: Removed session 8. May 14 23:38:47.717794 sshd[1872]: Accepted publickey for core from 147.75.109.163 port 38526 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:38:47.718539 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:38:47.721143 systemd-logind[1552]: New session 9 of user core. May 14 23:38:47.729610 systemd[1]: Started session-9.scope - Session 9 of User core. 
May 14 23:38:47.779021 sudo[1876]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 14 23:38:47.779243 sudo[1876]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 23:38:48.118688 systemd[1]: Starting docker.service - Docker Application Container Engine... May 14 23:38:48.130784 (dockerd)[1894]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 14 23:38:48.386801 dockerd[1894]: time="2025-05-14T23:38:48.386724305Z" level=info msg="Starting up" May 14 23:38:48.388074 dockerd[1894]: time="2025-05-14T23:38:48.388043035Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 14 23:38:48.405289 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport226099424-merged.mount: Deactivated successfully. May 14 23:38:48.425210 dockerd[1894]: time="2025-05-14T23:38:48.425186139Z" level=info msg="Loading containers: start." May 14 23:38:48.535145 kernel: Initializing XFRM netlink socket May 14 23:38:48.534910 systemd-timesyncd[1487]: Network configuration changed, trying to establish connection. May 14 23:40:09.693230 systemd-timesyncd[1487]: Contacted time server 23.186.168.129:123 (2.flatcar.pool.ntp.org). May 14 23:40:09.693263 systemd-timesyncd[1487]: Initial clock synchronization to Wed 2025-05-14 23:40:09.693072 UTC. May 14 23:40:09.693479 systemd-resolved[1464]: Clock change detected. Flushing caches. May 14 23:40:09.698660 systemd-networkd[1463]: docker0: Link UP May 14 23:40:09.731059 dockerd[1894]: time="2025-05-14T23:40:09.731033753Z" level=info msg="Loading containers: done." May 14 23:40:09.740190 dockerd[1894]: time="2025-05-14T23:40:09.740163794Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 14 23:40:09.740270 dockerd[1894]: time="2025-05-14T23:40:09.740218795Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 14 23:40:09.740293 dockerd[1894]: time="2025-05-14T23:40:09.740276282Z" level=info msg="Daemon has completed initialization" May 14 23:40:09.755301 dockerd[1894]: time="2025-05-14T23:40:09.755125545Z" level=info msg="API listen on /run/docker.sock" May 14 23:40:09.755281 systemd[1]: Started docker.service - Docker Application Container Engine. May 14 23:40:10.490039 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1089999528-merged.mount: Deactivated successfully. May 14 23:40:10.977449 containerd[1574]: time="2025-05-14T23:40:10.977110735Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 14 23:40:11.605856 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2628753656.mount: Deactivated successfully. 
May 14 23:40:12.637783 containerd[1574]: time="2025-05-14T23:40:12.637752315Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:12.638437 containerd[1574]: time="2025-05-14T23:40:12.638401021Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=27960987" May 14 23:40:12.639027 containerd[1574]: time="2025-05-14T23:40:12.638805578Z" level=info msg="ImageCreate event name:\"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:12.640133 containerd[1574]: time="2025-05-14T23:40:12.640107405Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:12.640916 containerd[1574]: time="2025-05-14T23:40:12.640646950Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"27957787\" in 1.663510579s" May 14 23:40:12.640916 containerd[1574]: time="2025-05-14T23:40:12.640668287Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:e6d208e868a9ca7f89efcb0d5bddc55a62df551cb4fb39c5099a2fe7b0e33adc\"" May 14 23:40:12.641840 containerd[1574]: time="2025-05-14T23:40:12.641819278Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 14 23:40:12.728238 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 14 23:40:12.729378 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:12.815672 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:12.818193 (kubelet)[2157]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 23:40:12.881851 kubelet[2157]: E0514 23:40:12.881813 2157 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 23:40:12.883515 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 23:40:12.883630 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 23:40:12.884128 systemd[1]: kubelet.service: Consumed 87ms CPU time, 97.9M memory peak. 
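The kubelet keeps exiting with status 1 because /var/lib/kubelet/config.yaml does not exist yet; on a node like this it is normally written by kubeadm during init/join rather than by hand. For illustration only, a minimal hand-written KubeletConfiguration consistent with the systemd cgroup driver seen in the containerd configuration above might look like:

    # /var/lib/kubelet/config.yaml  (illustrative sketch; kubeadm normally generates this file)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests
    authentication:
      anonymous:
        enabled: false

Once the file exists (or kubeadm has run), the scheduled restarts at 23:38:30, 23:38:41 and 23:40:12 would stop failing on the missing-config error.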
May 14 23:40:15.319347 containerd[1574]: time="2025-05-14T23:40:15.318823053Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:15.328076 containerd[1574]: time="2025-05-14T23:40:15.327817701Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=24713776" May 14 23:40:15.337103 containerd[1574]: time="2025-05-14T23:40:15.337061787Z" level=info msg="ImageCreate event name:\"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:15.339897 containerd[1574]: time="2025-05-14T23:40:15.339852909Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:15.340625 containerd[1574]: time="2025-05-14T23:40:15.340509705Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"26202149\" in 2.698668045s" May 14 23:40:15.340625 containerd[1574]: time="2025-05-14T23:40:15.340536018Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:fbda0bc3bc4bb93c8b2d8627a9aa8d945c200b51e48c88f9b837dde628fc7c8f\"" May 14 23:40:15.341149 containerd[1574]: time="2025-05-14T23:40:15.340959299Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 14 23:40:16.590337 containerd[1574]: time="2025-05-14T23:40:16.590089593Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:16.590766 containerd[1574]: time="2025-05-14T23:40:16.590739751Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=18780386" May 14 23:40:16.591355 containerd[1574]: time="2025-05-14T23:40:16.591152970Z" level=info msg="ImageCreate event name:\"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:16.592765 containerd[1574]: time="2025-05-14T23:40:16.592751255Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:16.593621 containerd[1574]: time="2025-05-14T23:40:16.593598655Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"20268777\" in 1.252616518s" May 14 23:40:16.593657 containerd[1574]: time="2025-05-14T23:40:16.593625365Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:2a9c646db0be37003c2b50605a252f7139145411d9e4e0badd8ae07f56ce5eb8\"" May 14 23:40:16.593930 
containerd[1574]: time="2025-05-14T23:40:16.593915892Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\"" May 14 23:40:17.529890 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4218065780.mount: Deactivated successfully. May 14 23:40:18.263464 containerd[1574]: time="2025-05-14T23:40:18.263435436Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:18.269453 containerd[1574]: time="2025-05-14T23:40:18.269415840Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=30354625" May 14 23:40:18.275390 containerd[1574]: time="2025-05-14T23:40:18.275224084Z" level=info msg="ImageCreate event name:\"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:18.280487 containerd[1574]: time="2025-05-14T23:40:18.280446286Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:18.281265 containerd[1574]: time="2025-05-14T23:40:18.280933700Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"30353644\" in 1.6869948s" May 14 23:40:18.281265 containerd[1574]: time="2025-05-14T23:40:18.280964533Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:7d73f013cedcf301aef42272c93e4c1174dab1a8eccd96840091ef04b63480f2\"" May 14 23:40:18.281394 containerd[1574]: time="2025-05-14T23:40:18.281344909Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 14 23:40:19.018034 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount706978121.mount: Deactivated successfully. 
May 14 23:40:19.838015 containerd[1574]: time="2025-05-14T23:40:19.837979976Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:19.842804 containerd[1574]: time="2025-05-14T23:40:19.842777371Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" May 14 23:40:19.855936 containerd[1574]: time="2025-05-14T23:40:19.855923653Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:19.866761 containerd[1574]: time="2025-05-14T23:40:19.866727554Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:19.867370 containerd[1574]: time="2025-05-14T23:40:19.867216556Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.585851448s" May 14 23:40:19.867370 containerd[1574]: time="2025-05-14T23:40:19.867234727Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" May 14 23:40:19.867592 containerd[1574]: time="2025-05-14T23:40:19.867497355Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 14 23:40:20.780617 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3749461216.mount: Deactivated successfully. 
May 14 23:40:20.800818 containerd[1574]: time="2025-05-14T23:40:20.800707056Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 23:40:20.802900 containerd[1574]: time="2025-05-14T23:40:20.802860140Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 14 23:40:20.807713 containerd[1574]: time="2025-05-14T23:40:20.807680418Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 23:40:20.812537 containerd[1574]: time="2025-05-14T23:40:20.812521368Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 23:40:20.813164 containerd[1574]: time="2025-05-14T23:40:20.812921984Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 945.407434ms" May 14 23:40:20.813164 containerd[1574]: time="2025-05-14T23:40:20.812945768Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 14 23:40:20.813534 containerd[1574]: time="2025-05-14T23:40:20.813254370Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" May 14 23:40:21.344709 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3587011435.mount: Deactivated successfully. May 14 23:40:22.715006 update_engine[1554]: I20250514 23:40:22.714514 1554 update_attempter.cc:509] Updating boot flags... May 14 23:40:22.774562 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (2288) May 14 23:40:22.827319 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (2292) May 14 23:40:22.894968 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 14 23:40:22.896338 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:23.454043 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:23.457177 (kubelet)[2309]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 23:40:23.603778 kubelet[2309]: E0514 23:40:23.603718 2309 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 23:40:23.604923 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 23:40:23.605020 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 23:40:23.605284 systemd[1]: kubelet.service: Consumed 94ms CPU time, 97.4M memory peak. 
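Both kubelet restarts so far (counters 3 and 4) fail the same way: /var/lib/kubelet/config.yaml does not exist, so the process exits with status=1 and systemd schedules another restart. That file is normally written during node bootstrap (for example by kubeadm init or kubeadm join), which has not happened yet at this point in the log. The helper below is a hypothetical sketch of that precondition check, not anything the node actually runs; the second path is an assumption about a kubeadm-style setup.

#!/usr/bin/env python3
"""List the bootstrap files the crash-looping kubelet above is still waiting for.

Assumption: a kubeadm-style bootstrap in which /var/lib/kubelet/config.yaml
(the file named in the error above) and /etc/kubernetes/kubelet.conf are
written by 'kubeadm init' or 'kubeadm join'.
"""
from pathlib import Path

EXPECTED = (
    Path("/var/lib/kubelet/config.yaml"),  # KubeletConfiguration the service fails to load above
    Path("/etc/kubernetes/kubelet.conf"),  # kubeconfig the kubelet uses to reach the apiserver
)

def missing_bootstrap_files() -> list[Path]:
    """Return the expected files that have not been written yet."""
    return [p for p in EXPECTED if not p.exists()]

if __name__ == "__main__":
    missing = missing_bootstrap_files()
    if missing:
        for path in missing:
            print(f"not yet written: {path}")
    else:
        print("kubelet bootstrap files are in place")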
May 14 23:40:23.612356 containerd[1574]: time="2025-05-14T23:40:23.612320279Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:23.613324 containerd[1574]: time="2025-05-14T23:40:23.613253957Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56780013" May 14 23:40:23.613324 containerd[1574]: time="2025-05-14T23:40:23.613290005Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:23.614469 containerd[1574]: time="2025-05-14T23:40:23.614441590Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:23.615112 containerd[1574]: time="2025-05-14T23:40:23.615036719Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 2.801761962s" May 14 23:40:23.615112 containerd[1574]: time="2025-05-14T23:40:23.615052955Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" May 14 23:40:25.798039 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:25.798505 systemd[1]: kubelet.service: Consumed 94ms CPU time, 97.4M memory peak. May 14 23:40:25.801365 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:25.826440 systemd[1]: Reload requested from client PID 2340 ('systemctl') (unit session-9.scope)... May 14 23:40:25.826539 systemd[1]: Reloading... May 14 23:40:25.892327 zram_generator::config[2386]: No configuration found. May 14 23:40:25.949426 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 14 23:40:25.967272 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 23:40:26.032850 systemd[1]: Reloading finished in 206 ms. May 14 23:40:26.055113 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 14 23:40:26.055252 systemd[1]: kubelet.service: Failed with result 'signal'. May 14 23:40:26.055493 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:26.057454 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:26.321944 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:26.328486 (kubelet)[2452]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 23:40:26.352325 kubelet[2452]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 14 23:40:26.352325 kubelet[2452]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 14 23:40:26.352325 kubelet[2452]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 23:40:26.352325 kubelet[2452]: I0514 23:40:26.352270 2452 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 23:40:26.544633 kubelet[2452]: I0514 23:40:26.543435 2452 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 14 23:40:26.544633 kubelet[2452]: I0514 23:40:26.543453 2452 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 23:40:26.544633 kubelet[2452]: I0514 23:40:26.543598 2452 server.go:929] "Client rotation is on, will bootstrap in background" May 14 23:40:26.568371 kubelet[2452]: I0514 23:40:26.568029 2452 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 23:40:26.568721 kubelet[2452]: E0514 23:40:26.568709 2452 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:26.579396 kubelet[2452]: I0514 23:40:26.579353 2452 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 23:40:26.584606 kubelet[2452]: I0514 23:40:26.584593 2452 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 14 23:40:26.585505 kubelet[2452]: I0514 23:40:26.585492 2452 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 14 23:40:26.585593 kubelet[2452]: I0514 23:40:26.585579 2452 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 23:40:26.585686 kubelet[2452]: I0514 23:40:26.585594 2452 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 14 23:40:26.585686 kubelet[2452]: I0514 23:40:26.585686 2452 topology_manager.go:138] "Creating topology manager with none policy" May 14 23:40:26.585771 kubelet[2452]: I0514 23:40:26.585691 2452 container_manager_linux.go:300] "Creating device plugin manager" May 14 23:40:26.585771 kubelet[2452]: I0514 23:40:26.585750 2452 state_mem.go:36] "Initialized new in-memory state store" May 14 23:40:26.587471 kubelet[2452]: I0514 23:40:26.587271 2452 kubelet.go:408] "Attempting to sync node with API server" May 14 23:40:26.587471 kubelet[2452]: I0514 23:40:26.587299 2452 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 23:40:26.587471 kubelet[2452]: I0514 23:40:26.587329 2452 kubelet.go:314] "Adding apiserver pod source" May 14 23:40:26.587471 kubelet[2452]: I0514 23:40:26.587363 2452 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 23:40:26.591038 kubelet[2452]: W0514 23:40:26.590757 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:26.591038 kubelet[2452]: E0514 23:40:26.590786 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:26.591971 kubelet[2452]: W0514 23:40:26.591893 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:26.591971 kubelet[2452]: E0514 23:40:26.591920 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:26.592149 kubelet[2452]: I0514 23:40:26.592064 2452 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 23:40:26.593387 kubelet[2452]: I0514 23:40:26.593293 2452 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 23:40:26.594053 kubelet[2452]: W0514 23:40:26.593750 2452 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 14 23:40:26.594964 kubelet[2452]: I0514 23:40:26.594825 2452 server.go:1269] "Started kubelet" May 14 23:40:26.602274 kubelet[2452]: E0514 23:40:26.599880 2452 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.107:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.107:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f89319b4ab16b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-14 23:40:26.594808171 +0000 UTC m=+0.264197443,LastTimestamp:2025-05-14 23:40:26.594808171 +0000 UTC m=+0.264197443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" May 14 23:40:26.602274 kubelet[2452]: I0514 23:40:26.601973 2452 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 23:40:26.602274 kubelet[2452]: I0514 23:40:26.602183 2452 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 23:40:26.602759 kubelet[2452]: I0514 23:40:26.602748 2452 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 23:40:26.603603 kubelet[2452]: I0514 23:40:26.603565 2452 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 23:40:26.607068 kubelet[2452]: I0514 23:40:26.606646 2452 server.go:460] "Adding debug handlers to kubelet server" May 14 23:40:26.608683 kubelet[2452]: I0514 23:40:26.608357 2452 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 23:40:26.609680 kubelet[2452]: I0514 23:40:26.609673 2452 volume_manager.go:289] "Starting Kubelet Volume Manager" May 14 23:40:26.609819 
kubelet[2452]: E0514 23:40:26.609808 2452 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 14 23:40:26.612683 kubelet[2452]: I0514 23:40:26.612674 2452 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 14 23:40:26.612769 kubelet[2452]: I0514 23:40:26.612763 2452 reconciler.go:26] "Reconciler: start to sync state" May 14 23:40:26.613407 kubelet[2452]: I0514 23:40:26.613398 2452 factory.go:221] Registration of the systemd container factory successfully May 14 23:40:26.613492 kubelet[2452]: I0514 23:40:26.613482 2452 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 23:40:26.613729 kubelet[2452]: W0514 23:40:26.613712 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:26.613781 kubelet[2452]: E0514 23:40:26.613771 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:26.613845 kubelet[2452]: E0514 23:40:26.613834 2452 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="200ms" May 14 23:40:26.614318 kubelet[2452]: E0514 23:40:26.614301 2452 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 23:40:26.618577 kubelet[2452]: I0514 23:40:26.615442 2452 factory.go:221] Registration of the containerd container factory successfully May 14 23:40:26.618577 kubelet[2452]: I0514 23:40:26.616200 2452 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 23:40:26.618577 kubelet[2452]: I0514 23:40:26.616793 2452 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 14 23:40:26.618577 kubelet[2452]: I0514 23:40:26.616805 2452 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 23:40:26.618577 kubelet[2452]: I0514 23:40:26.616816 2452 kubelet.go:2321] "Starting kubelet main sync loop" May 14 23:40:26.618577 kubelet[2452]: E0514 23:40:26.616835 2452 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 23:40:26.621046 kubelet[2452]: W0514 23:40:26.621020 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:26.621100 kubelet[2452]: E0514 23:40:26.621050 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:26.641183 kubelet[2452]: I0514 23:40:26.641168 2452 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 23:40:26.641183 kubelet[2452]: I0514 23:40:26.641178 2452 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 23:40:26.641245 kubelet[2452]: I0514 23:40:26.641192 2452 state_mem.go:36] "Initialized new in-memory state store" May 14 23:40:26.642179 kubelet[2452]: I0514 23:40:26.642168 2452 policy_none.go:49] "None policy: Start" May 14 23:40:26.642444 kubelet[2452]: I0514 23:40:26.642434 2452 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 23:40:26.642473 kubelet[2452]: I0514 23:40:26.642446 2452 state_mem.go:35] "Initializing new in-memory state store" May 14 23:40:26.649072 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 14 23:40:26.656964 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 14 23:40:26.658764 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 14 23:40:26.666130 kubelet[2452]: I0514 23:40:26.665710 2452 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 23:40:26.666130 kubelet[2452]: I0514 23:40:26.665814 2452 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 23:40:26.666130 kubelet[2452]: I0514 23:40:26.665820 2452 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 23:40:26.666130 kubelet[2452]: I0514 23:40:26.666090 2452 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 23:40:26.666933 kubelet[2452]: E0514 23:40:26.666855 2452 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" May 14 23:40:26.723254 systemd[1]: Created slice kubepods-burstable-pod80bcf52c3a4bb3c6253c0a1251802585.slice - libcontainer container kubepods-burstable-pod80bcf52c3a4bb3c6253c0a1251802585.slice. May 14 23:40:26.742552 systemd[1]: Created slice kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice - libcontainer container kubepods-burstable-podd4a6b755cb4739fbca401212ebb82b6d.slice. 
May 14 23:40:26.749988 systemd[1]: Created slice kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice - libcontainer container kubepods-burstable-pod0613557c150e4f35d1f3f822b5f32ff1.slice. May 14 23:40:26.767397 kubelet[2452]: I0514 23:40:26.767375 2452 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:26.767654 kubelet[2452]: E0514 23:40:26.767638 2452 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" May 14 23:40:26.814156 kubelet[2452]: E0514 23:40:26.814127 2452 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="400ms" May 14 23:40:26.914576 kubelet[2452]: I0514 23:40:26.914477 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:26.914880 kubelet[2452]: I0514 23:40:26.914706 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:26.914880 kubelet[2452]: I0514 23:40:26.914727 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:26.914880 kubelet[2452]: I0514 23:40:26.914742 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:26.914880 kubelet[2452]: I0514 23:40:26.914777 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 14 23:40:26.914880 kubelet[2452]: I0514 23:40:26.914796 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:26.915029 kubelet[2452]: I0514 23:40:26.914810 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod 
\"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:26.915029 kubelet[2452]: I0514 23:40:26.914832 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:26.915029 kubelet[2452]: I0514 23:40:26.914848 2452 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:26.969494 kubelet[2452]: I0514 23:40:26.969473 2452 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:26.969837 kubelet[2452]: E0514 23:40:26.969810 2452 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" May 14 23:40:27.041286 containerd[1574]: time="2025-05-14T23:40:27.041255552Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:80bcf52c3a4bb3c6253c0a1251802585,Namespace:kube-system,Attempt:0,}" May 14 23:40:27.058594 containerd[1574]: time="2025-05-14T23:40:27.058519583Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,}" May 14 23:40:27.058594 containerd[1574]: time="2025-05-14T23:40:27.058523759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,}" May 14 23:40:27.174444 containerd[1574]: time="2025-05-14T23:40:27.174032607Z" level=info msg="connecting to shim e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27" address="unix:///run/containerd/s/db210d0addd5ababf9543b8f95e26fba352e7989bdba37163a70ccad0982009c" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:27.178197 containerd[1574]: time="2025-05-14T23:40:27.177934452Z" level=info msg="connecting to shim a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a" address="unix:///run/containerd/s/838d33fc6f0c616a9f794f5b3c25760b3db7b889862f31284881f13f2a24282a" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:27.178519 containerd[1574]: time="2025-05-14T23:40:27.178503162Z" level=info msg="connecting to shim 492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6" address="unix:///run/containerd/s/deb657b83b5c5cd540a74845bef9bc6f19f9d1a74bb53e8978a35e710a2d8052" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:27.215121 kubelet[2452]: E0514 23:40:27.215097 2452 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="800ms" May 14 23:40:27.239447 systemd[1]: Started cri-containerd-492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6.scope - libcontainer container 
492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6. May 14 23:40:27.243704 systemd[1]: Started cri-containerd-a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a.scope - libcontainer container a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a. May 14 23:40:27.244887 systemd[1]: Started cri-containerd-e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27.scope - libcontainer container e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27. May 14 23:40:27.288673 containerd[1574]: time="2025-05-14T23:40:27.288488588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:80bcf52c3a4bb3c6253c0a1251802585,Namespace:kube-system,Attempt:0,} returns sandbox id \"e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27\"" May 14 23:40:27.294951 containerd[1574]: time="2025-05-14T23:40:27.293556284Z" level=info msg="CreateContainer within sandbox \"e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 14 23:40:27.300949 containerd[1574]: time="2025-05-14T23:40:27.300910372Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:d4a6b755cb4739fbca401212ebb82b6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a\"" May 14 23:40:27.304558 containerd[1574]: time="2025-05-14T23:40:27.304532260Z" level=info msg="CreateContainer within sandbox \"a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 14 23:40:27.308845 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4040603902.mount: Deactivated successfully. 
May 14 23:40:27.310768 containerd[1574]: time="2025-05-14T23:40:27.310693277Z" level=info msg="Container 11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:27.313873 containerd[1574]: time="2025-05-14T23:40:27.313726978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:0613557c150e4f35d1f3f822b5f32ff1,Namespace:kube-system,Attempt:0,} returns sandbox id \"492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6\"" May 14 23:40:27.316394 containerd[1574]: time="2025-05-14T23:40:27.315687712Z" level=info msg="Container 8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:27.318111 containerd[1574]: time="2025-05-14T23:40:27.318073272Z" level=info msg="CreateContainer within sandbox \"492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 14 23:40:27.318554 containerd[1574]: time="2025-05-14T23:40:27.318520309Z" level=info msg="CreateContainer within sandbox \"e098c5213e2d307fdac5542e0b5e959e03a6dad33e65e6dc3a8775b21c343d27\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509\"" May 14 23:40:27.319501 containerd[1574]: time="2025-05-14T23:40:27.319481583Z" level=info msg="StartContainer for \"11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509\"" May 14 23:40:27.322299 containerd[1574]: time="2025-05-14T23:40:27.322033100Z" level=info msg="connecting to shim 11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509" address="unix:///run/containerd/s/db210d0addd5ababf9543b8f95e26fba352e7989bdba37163a70ccad0982009c" protocol=ttrpc version=3 May 14 23:40:27.322461 containerd[1574]: time="2025-05-14T23:40:27.322448555Z" level=info msg="CreateContainer within sandbox \"a405503a96410c3476801a3fcfe635e4ad03be967f6b0b73efc666f6629ec60a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893\"" May 14 23:40:27.322944 containerd[1574]: time="2025-05-14T23:40:27.322932523Z" level=info msg="StartContainer for \"8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893\"" May 14 23:40:27.323660 containerd[1574]: time="2025-05-14T23:40:27.323645170Z" level=info msg="connecting to shim 8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893" address="unix:///run/containerd/s/838d33fc6f0c616a9f794f5b3c25760b3db7b889862f31284881f13f2a24282a" protocol=ttrpc version=3 May 14 23:40:27.336433 systemd[1]: Started cri-containerd-11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509.scope - libcontainer container 11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509. May 14 23:40:27.338945 systemd[1]: Started cri-containerd-8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893.scope - libcontainer container 8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893. 
May 14 23:40:27.360758 containerd[1574]: time="2025-05-14T23:40:27.360341523Z" level=info msg="Container fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:27.371546 kubelet[2452]: I0514 23:40:27.371528 2452 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:27.372086 kubelet[2452]: E0514 23:40:27.372026 2452 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" May 14 23:40:27.410633 containerd[1574]: time="2025-05-14T23:40:27.410607488Z" level=info msg="CreateContainer within sandbox \"492354c23830442f1d358deeabf4e84de5fd55c928e64979d2dd71d3f4da62a6\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975\"" May 14 23:40:27.411251 containerd[1574]: time="2025-05-14T23:40:27.411225832Z" level=info msg="StartContainer for \"fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975\"" May 14 23:40:27.412054 containerd[1574]: time="2025-05-14T23:40:27.411933027Z" level=info msg="StartContainer for \"8e838076ada9a91ce0697a2d1066a3338651de6032d9b21c69bc6be434494893\" returns successfully" May 14 23:40:27.412226 containerd[1574]: time="2025-05-14T23:40:27.412079163Z" level=info msg="StartContainer for \"11899a0df2a98f0691c061c43bb721c8c0828d94dcd83a014cc28448c3225509\" returns successfully" May 14 23:40:27.412870 containerd[1574]: time="2025-05-14T23:40:27.412452233Z" level=info msg="connecting to shim fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975" address="unix:///run/containerd/s/deb657b83b5c5cd540a74845bef9bc6f19f9d1a74bb53e8978a35e710a2d8052" protocol=ttrpc version=3 May 14 23:40:27.427950 kubelet[2452]: W0514 23:40:27.427480 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:27.428584 kubelet[2452]: E0514 23:40:27.428475 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:27.431422 systemd[1]: Started cri-containerd-fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975.scope - libcontainer container fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975. 
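Every reflector warning and certificate error in this stretch of the log is the same symptom seen from different callers: nothing is listening on 139.178.70.107:6443 yet, because the kube-apiserver container that was just started has not begun serving, so each attempt ends in "connection refused". A minimal probe like the sketch below (the address comes from the log; the timeout and retry interval are arbitrary choices) is enough to watch for the moment the port opens.

#!/usr/bin/env python3
"""Wait for the apiserver endpoint the kubelet reflectors above keep failing to reach."""
import socket
import time

HOST, PORT = "139.178.70.107", 6443  # endpoint taken from the reflector errors above

def wait_for_apiserver(timeout_s: float = 60.0, interval_s: float = 2.0) -> bool:
    """Return True once a TCP connection to HOST:PORT succeeds, False if the deadline passes."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((HOST, PORT), timeout=2.0):
                return True
        except OSError as exc:  # ConnectionRefusedError, timeouts, unreachable network, ...
            print(f"{HOST}:{PORT} not reachable yet: {exc}")
            time.sleep(interval_s)
    return False

if __name__ == "__main__":
    print("apiserver is accepting connections" if wait_for_apiserver() else "gave up waiting")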
May 14 23:40:27.477192 containerd[1574]: time="2025-05-14T23:40:27.477146522Z" level=info msg="StartContainer for \"fc6dfff5e9ab88e39720222614ce3d7cd68ba5e7721eb4271dab6f754fdce975\" returns successfully" May 14 23:40:27.655548 kubelet[2452]: W0514 23:40:27.655492 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:27.655548 kubelet[2452]: E0514 23:40:27.655531 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:27.766233 kubelet[2452]: W0514 23:40:27.765491 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:27.766233 kubelet[2452]: E0514 23:40:27.765534 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:27.856405 kubelet[2452]: W0514 23:40:27.856344 2452 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused May 14 23:40:27.856405 kubelet[2452]: E0514 23:40:27.856388 2452 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError" May 14 23:40:28.015465 kubelet[2452]: E0514 23:40:28.015436 2452 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="1.6s" May 14 23:40:28.173684 kubelet[2452]: I0514 23:40:28.173419 2452 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:28.173684 kubelet[2452]: E0514 23:40:28.173605 2452 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" May 14 23:40:29.620612 kubelet[2452]: E0514 23:40:29.620585 2452 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 14 23:40:29.775606 kubelet[2452]: I0514 23:40:29.775405 2452 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:29.783688 kubelet[2452]: I0514 23:40:29.783666 2452 kubelet_node_status.go:75] 
"Successfully registered node" node="localhost" May 14 23:40:29.783688 kubelet[2452]: E0514 23:40:29.783688 2452 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" May 14 23:40:29.788761 kubelet[2452]: E0514 23:40:29.788733 2452 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 14 23:40:29.889713 kubelet[2452]: E0514 23:40:29.889480 2452 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 14 23:40:29.990266 kubelet[2452]: E0514 23:40:29.990204 2452 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 14 23:40:30.091091 kubelet[2452]: E0514 23:40:30.091042 2452 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" May 14 23:40:30.593470 kubelet[2452]: I0514 23:40:30.592754 2452 apiserver.go:52] "Watching apiserver" May 14 23:40:30.614105 kubelet[2452]: I0514 23:40:30.614081 2452 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 14 23:40:31.439550 systemd[1]: Reload requested from client PID 2719 ('systemctl') (unit session-9.scope)... May 14 23:40:31.439563 systemd[1]: Reloading... May 14 23:40:31.520323 zram_generator::config[2773]: No configuration found. May 14 23:40:31.582131 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") May 14 23:40:31.601927 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 23:40:31.683235 systemd[1]: Reloading finished in 243 ms. May 14 23:40:31.706053 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:31.718957 systemd[1]: kubelet.service: Deactivated successfully. May 14 23:40:31.720167 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:31.720342 systemd[1]: kubelet.service: Consumed 423ms CPU time, 113.1M memory peak. May 14 23:40:31.725709 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 23:40:33.232494 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 23:40:33.239767 (kubelet)[2831]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 23:40:33.432833 kubelet[2831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 23:40:33.432833 kubelet[2831]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 14 23:40:33.432833 kubelet[2831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 14 23:40:33.432833 kubelet[2831]: I0514 23:40:33.432801 2831 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 23:40:33.443025 kubelet[2831]: I0514 23:40:33.442985 2831 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 14 23:40:33.443025 kubelet[2831]: I0514 23:40:33.443031 2831 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 23:40:33.443659 kubelet[2831]: I0514 23:40:33.443633 2831 server.go:929] "Client rotation is on, will bootstrap in background" May 14 23:40:33.446725 kubelet[2831]: I0514 23:40:33.446402 2831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 14 23:40:33.450147 kubelet[2831]: I0514 23:40:33.449988 2831 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 23:40:33.456049 kubelet[2831]: I0514 23:40:33.456015 2831 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 23:40:33.459050 kubelet[2831]: I0514 23:40:33.458775 2831 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 14 23:40:33.459050 kubelet[2831]: I0514 23:40:33.458881 2831 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 14 23:40:33.461588 kubelet[2831]: I0514 23:40:33.461472 2831 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 23:40:33.461707 kubelet[2831]: I0514 23:40:33.461535 2831 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 14 23:40:33.461707 kubelet[2831]: I0514 23:40:33.461677 2831 topology_manager.go:138] "Creating topology manager with none policy" May 14 23:40:33.461707 kubelet[2831]: I0514 23:40:33.461683 2831 container_manager_linux.go:300] "Creating device plugin manager" 
May 14 23:40:33.461818 kubelet[2831]: I0514 23:40:33.461713 2831 state_mem.go:36] "Initialized new in-memory state store" May 14 23:40:33.465011 kubelet[2831]: I0514 23:40:33.464135 2831 kubelet.go:408] "Attempting to sync node with API server" May 14 23:40:33.465011 kubelet[2831]: I0514 23:40:33.464619 2831 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 23:40:33.465011 kubelet[2831]: I0514 23:40:33.464641 2831 kubelet.go:314] "Adding apiserver pod source" May 14 23:40:33.465011 kubelet[2831]: I0514 23:40:33.464651 2831 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 23:40:33.469461 kubelet[2831]: I0514 23:40:33.469431 2831 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 23:40:33.469773 kubelet[2831]: I0514 23:40:33.469753 2831 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 23:40:33.470504 kubelet[2831]: I0514 23:40:33.470010 2831 server.go:1269] "Started kubelet" May 14 23:40:33.486749 kubelet[2831]: I0514 23:40:33.485466 2831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 23:40:33.486749 kubelet[2831]: I0514 23:40:33.486066 2831 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 23:40:33.486749 kubelet[2831]: I0514 23:40:33.486121 2831 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 14 23:40:33.489621 kubelet[2831]: I0514 23:40:33.489602 2831 server.go:460] "Adding debug handlers to kubelet server" May 14 23:40:33.490509 kubelet[2831]: I0514 23:40:33.490475 2831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 23:40:33.490692 kubelet[2831]: I0514 23:40:33.490684 2831 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 23:40:33.491375 kubelet[2831]: I0514 23:40:33.491361 2831 volume_manager.go:289] "Starting Kubelet Volume Manager" May 14 23:40:33.493512 kubelet[2831]: I0514 23:40:33.493148 2831 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 14 23:40:33.493512 kubelet[2831]: I0514 23:40:33.493261 2831 reconciler.go:26] "Reconciler: start to sync state" May 14 23:40:33.495233 kubelet[2831]: I0514 23:40:33.495067 2831 factory.go:221] Registration of the systemd container factory successfully May 14 23:40:33.496079 kubelet[2831]: I0514 23:40:33.495985 2831 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 23:40:33.502247 kubelet[2831]: E0514 23:40:33.501739 2831 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 14 23:40:33.502583 kubelet[2831]: I0514 23:40:33.502566 2831 factory.go:221] Registration of the containerd container factory successfully May 14 23:40:33.509264 kubelet[2831]: I0514 23:40:33.509233 2831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 23:40:33.509965 kubelet[2831]: I0514 23:40:33.509952 2831 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 14 23:40:33.510015 kubelet[2831]: I0514 23:40:33.509973 2831 status_manager.go:217] "Starting to sync pod status with apiserver" May 14 23:40:33.510015 kubelet[2831]: I0514 23:40:33.509987 2831 kubelet.go:2321] "Starting kubelet main sync loop" May 14 23:40:33.510056 kubelet[2831]: E0514 23:40:33.510024 2831 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 23:40:33.554129 kubelet[2831]: I0514 23:40:33.554056 2831 cpu_manager.go:214] "Starting CPU manager" policy="none" May 14 23:40:33.554129 kubelet[2831]: I0514 23:40:33.554116 2831 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 14 23:40:33.554274 kubelet[2831]: I0514 23:40:33.554141 2831 state_mem.go:36] "Initialized new in-memory state store" May 14 23:40:33.554294 kubelet[2831]: I0514 23:40:33.554275 2831 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 14 23:40:33.554294 kubelet[2831]: I0514 23:40:33.554283 2831 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 14 23:40:33.554342 kubelet[2831]: I0514 23:40:33.554297 2831 policy_none.go:49] "None policy: Start" May 14 23:40:33.555648 kubelet[2831]: I0514 23:40:33.555629 2831 memory_manager.go:170] "Starting memorymanager" policy="None" May 14 23:40:33.555648 kubelet[2831]: I0514 23:40:33.555649 2831 state_mem.go:35] "Initializing new in-memory state store" May 14 23:40:33.555760 kubelet[2831]: I0514 23:40:33.555749 2831 state_mem.go:75] "Updated machine memory state" May 14 23:40:33.562095 kubelet[2831]: I0514 23:40:33.562062 2831 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 23:40:33.562482 kubelet[2831]: I0514 23:40:33.562467 2831 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 23:40:33.562516 kubelet[2831]: I0514 23:40:33.562480 2831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 23:40:33.562807 kubelet[2831]: I0514 23:40:33.562793 2831 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 23:40:33.669253 kubelet[2831]: I0514 23:40:33.669238 2831 kubelet_node_status.go:72] "Attempting to register node" node="localhost" May 14 23:40:33.677573 kubelet[2831]: I0514 23:40:33.677543 2831 kubelet_node_status.go:111] "Node was previously registered" node="localhost" May 14 23:40:33.677691 kubelet[2831]: I0514 23:40:33.677602 2831 kubelet_node_status.go:75] "Successfully registered node" node="localhost" May 14 23:40:33.794643 kubelet[2831]: I0514 23:40:33.794452 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:33.794643 kubelet[2831]: I0514 23:40:33.794493 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0613557c150e4f35d1f3f822b5f32ff1-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"0613557c150e4f35d1f3f822b5f32ff1\") " pod="kube-system/kube-scheduler-localhost" May 14 23:40:33.794643 kubelet[2831]: I0514 23:40:33.794510 2831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:33.794643 kubelet[2831]: I0514 23:40:33.794520 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:33.794643 kubelet[2831]: I0514 23:40:33.794529 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:33.794895 kubelet[2831]: I0514 23:40:33.794538 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:33.794895 kubelet[2831]: I0514 23:40:33.794547 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:33.794895 kubelet[2831]: I0514 23:40:33.794555 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4a6b755cb4739fbca401212ebb82b6d-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"d4a6b755cb4739fbca401212ebb82b6d\") " pod="kube-system/kube-controller-manager-localhost" May 14 23:40:33.794895 kubelet[2831]: I0514 23:40:33.794564 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/80bcf52c3a4bb3c6253c0a1251802585-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"80bcf52c3a4bb3c6253c0a1251802585\") " pod="kube-system/kube-apiserver-localhost" May 14 23:40:34.471068 kubelet[2831]: I0514 23:40:34.471046 2831 apiserver.go:52] "Watching apiserver" May 14 23:40:34.493569 kubelet[2831]: I0514 23:40:34.493543 2831 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 14 23:40:34.592583 kubelet[2831]: I0514 23:40:34.592545 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.592533629 podStartE2EDuration="1.592533629s" podCreationTimestamp="2025-05-14 23:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:40:34.561863057 +0000 UTC m=+1.194664725" watchObservedRunningTime="2025-05-14 23:40:34.592533629 +0000 UTC m=+1.225335290" May 14 23:40:34.620760 
kubelet[2831]: I0514 23:40:34.620708 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.62069438 podStartE2EDuration="1.62069438s" podCreationTimestamp="2025-05-14 23:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:40:34.593634458 +0000 UTC m=+1.226436118" watchObservedRunningTime="2025-05-14 23:40:34.62069438 +0000 UTC m=+1.253496040" May 14 23:40:34.678519 kubelet[2831]: I0514 23:40:34.678484 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.678470514 podStartE2EDuration="1.678470514s" podCreationTimestamp="2025-05-14 23:40:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:40:34.623094797 +0000 UTC m=+1.255896464" watchObservedRunningTime="2025-05-14 23:40:34.678470514 +0000 UTC m=+1.311272172" May 14 23:40:36.337669 kubelet[2831]: I0514 23:40:36.337644 2831 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 14 23:40:36.338134 kubelet[2831]: I0514 23:40:36.337930 2831 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 14 23:40:36.338264 containerd[1574]: time="2025-05-14T23:40:36.337820818Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 14 23:40:36.599592 systemd[1]: Created slice kubepods-besteffort-podebac3664_8019_4b4e_afbc_b17834e80f51.slice - libcontainer container kubepods-besteffort-podebac3664_8019_4b4e_afbc_b17834e80f51.slice. 
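The systemd entry just above shows how the kubelet (with the systemd cgroup driver) names the cgroup for a BestEffort pod: the dashes in the pod UID are replaced with underscores and the result is wrapped in a kubepods-besteffort-pod<uid>.slice unit. A minimal sketch of that naming transform, using the kube-proxy UID from this log (illustrative only, not the kubelet's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// sliceNameForBestEffortPod mirrors the naming visible in the journal above:
// dashes in the pod UID become underscores and the result is wrapped in a
// kubepods-besteffort-pod<uid>.slice unit name. Illustrative sketch only.
func sliceNameForBestEffortPod(podUID string) string {
	escaped := strings.ReplaceAll(podUID, "-", "_")
	return fmt.Sprintf("kubepods-besteffort-pod%s.slice", escaped)
}

func main() {
	// UID taken from the kube-proxy-x6nmz entries in this log.
	fmt.Println(sliceNameForBestEffortPod("ebac3664-8019-4b4e-afbc-b17834e80f51"))
	// Output: kubepods-besteffort-podebac3664_8019_4b4e_afbc_b17834e80f51.slice
}
```

Guaranteed pods land directly under kubepods.slice and Burstable pods under kubepods-burstable.slice, so the QoS class is readable straight from the unit name.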
May 14 23:40:36.610363 kubelet[2831]: I0514 23:40:36.610293 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bzt\" (UniqueName: \"kubernetes.io/projected/ebac3664-8019-4b4e-afbc-b17834e80f51-kube-api-access-h8bzt\") pod \"kube-proxy-x6nmz\" (UID: \"ebac3664-8019-4b4e-afbc-b17834e80f51\") " pod="kube-system/kube-proxy-x6nmz" May 14 23:40:36.610363 kubelet[2831]: I0514 23:40:36.610337 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ebac3664-8019-4b4e-afbc-b17834e80f51-kube-proxy\") pod \"kube-proxy-x6nmz\" (UID: \"ebac3664-8019-4b4e-afbc-b17834e80f51\") " pod="kube-system/kube-proxy-x6nmz" May 14 23:40:36.610363 kubelet[2831]: I0514 23:40:36.610358 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ebac3664-8019-4b4e-afbc-b17834e80f51-xtables-lock\") pod \"kube-proxy-x6nmz\" (UID: \"ebac3664-8019-4b4e-afbc-b17834e80f51\") " pod="kube-system/kube-proxy-x6nmz" May 14 23:40:36.610527 kubelet[2831]: I0514 23:40:36.610373 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ebac3664-8019-4b4e-afbc-b17834e80f51-lib-modules\") pod \"kube-proxy-x6nmz\" (UID: \"ebac3664-8019-4b4e-afbc-b17834e80f51\") " pod="kube-system/kube-proxy-x6nmz" May 14 23:40:36.742704 kubelet[2831]: E0514 23:40:36.742643 2831 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found May 14 23:40:36.742704 kubelet[2831]: E0514 23:40:36.742664 2831 projected.go:194] Error preparing data for projected volume kube-api-access-h8bzt for pod kube-system/kube-proxy-x6nmz: configmap "kube-root-ca.crt" not found May 14 23:40:36.742822 kubelet[2831]: E0514 23:40:36.742726 2831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ebac3664-8019-4b4e-afbc-b17834e80f51-kube-api-access-h8bzt podName:ebac3664-8019-4b4e-afbc-b17834e80f51 nodeName:}" failed. No retries permitted until 2025-05-14 23:40:37.242702824 +0000 UTC m=+3.875504485 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-h8bzt" (UniqueName: "kubernetes.io/projected/ebac3664-8019-4b4e-afbc-b17834e80f51-kube-api-access-h8bzt") pod "kube-proxy-x6nmz" (UID: "ebac3664-8019-4b4e-afbc-b17834e80f51") : configmap "kube-root-ca.crt" not found May 14 23:40:37.408625 systemd[1]: Created slice kubepods-besteffort-pod53e54bf1_b1bc_4926_925b_fd8067decbfb.slice - libcontainer container kubepods-besteffort-pod53e54bf1_b1bc_4926_925b_fd8067decbfb.slice. 
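The MountVolume.SetUp failure above is the projected service-account volume (kube-api-access-h8bzt) waiting for the kube-root-ca.crt ConfigMap, which the control plane publishes into every namespace shortly after bootstrap; until it appears, the kubelet simply re-queues the mount (durationBeforeRetry 500ms). A hedged client-go sketch for checking whether that ConfigMap has been published yet; the kubeconfig path is a placeholder and would need to match the node being inspected:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Placeholder kubeconfig path; adjust for the cluster being inspected.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// kube-root-ca.crt is published into every namespace; kube-system is the
	// one the failing kube-proxy mount above is waiting for.
	cm, err := cs.CoreV1().ConfigMaps("kube-system").Get(context.TODO(), "kube-root-ca.crt", metav1.GetOptions{})
	if err != nil {
		fmt.Println("not published yet:", err)
		return
	}
	fmt.Printf("found %s with %d key(s)\n", cm.Name, len(cm.Data))
}
```

Once the ConfigMap exists, the retried mount succeeds and kube-proxy-x6nmz proceeds to the sandbox creation seen below.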
May 14 23:40:37.415982 kubelet[2831]: I0514 23:40:37.415916 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/53e54bf1-b1bc-4926-925b-fd8067decbfb-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-wm7fm\" (UID: \"53e54bf1-b1bc-4926-925b-fd8067decbfb\") " pod="tigera-operator/tigera-operator-6f6897fdc5-wm7fm" May 14 23:40:37.415982 kubelet[2831]: I0514 23:40:37.415950 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7swx\" (UniqueName: \"kubernetes.io/projected/53e54bf1-b1bc-4926-925b-fd8067decbfb-kube-api-access-l7swx\") pod \"tigera-operator-6f6897fdc5-wm7fm\" (UID: \"53e54bf1-b1bc-4926-925b-fd8067decbfb\") " pod="tigera-operator/tigera-operator-6f6897fdc5-wm7fm" May 14 23:40:37.509918 containerd[1574]: time="2025-05-14T23:40:37.509873237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x6nmz,Uid:ebac3664-8019-4b4e-afbc-b17834e80f51,Namespace:kube-system,Attempt:0,}" May 14 23:40:37.531505 containerd[1574]: time="2025-05-14T23:40:37.531446098Z" level=info msg="connecting to shim f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151" address="unix:///run/containerd/s/c0afa1a0b26aefb0fd28e3b1efe081944d0474b6e4f4ef868dfe1d9a90944158" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:37.549405 systemd[1]: Started cri-containerd-f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151.scope - libcontainer container f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151. May 14 23:40:37.565219 containerd[1574]: time="2025-05-14T23:40:37.565193047Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-x6nmz,Uid:ebac3664-8019-4b4e-afbc-b17834e80f51,Namespace:kube-system,Attempt:0,} returns sandbox id \"f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151\"" May 14 23:40:37.567695 containerd[1574]: time="2025-05-14T23:40:37.567671866Z" level=info msg="CreateContainer within sandbox \"f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 14 23:40:37.599246 containerd[1574]: time="2025-05-14T23:40:37.599221453Z" level=info msg="Container f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:37.616805 containerd[1574]: time="2025-05-14T23:40:37.616732043Z" level=info msg="CreateContainer within sandbox \"f12be526fcadf3104f83b7e5d9c3c7f2ef6174c783ed54f09463be4c5aaaf151\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871\"" May 14 23:40:37.617334 containerd[1574]: time="2025-05-14T23:40:37.617267849Z" level=info msg="StartContainer for \"f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871\"" May 14 23:40:37.618523 containerd[1574]: time="2025-05-14T23:40:37.618498499Z" level=info msg="connecting to shim f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871" address="unix:///run/containerd/s/c0afa1a0b26aefb0fd28e3b1efe081944d0474b6e4f4ef868dfe1d9a90944158" protocol=ttrpc version=3 May 14 23:40:37.638442 systemd[1]: Started cri-containerd-f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871.scope - libcontainer container f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871. 
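The kube-proxy entries above trace the CRI sequence the kubelet drives through containerd: RunPodSandbox returns a sandbox ID, CreateContainer stages the kube-proxy container inside that sandbox, and StartContainer launches it. The sketch below only models that ordering against a deliberately simplified interface; it is not the real CRI gRPC client, and the method names and image string are illustrative:

```go
package main

import "fmt"

// runtimeService is a simplified stand-in for the CRI RuntimeService the
// kubelet talks to through containerd; not the real gRPC interface.
type runtimeService interface {
	RunPodSandbox(name, namespace string) (sandboxID string, err error)
	CreateContainer(sandboxID, containerName, image string) (containerID string, err error)
	StartContainer(containerID string) error
}

// startPod mirrors the order of events visible in the journal above:
// sandbox first, then container creation inside it, then start.
func startPod(rs runtimeService) error {
	sandboxID, err := rs.RunPodSandbox("kube-proxy-x6nmz", "kube-system")
	if err != nil {
		return fmt.Errorf("RunPodSandbox: %w", err)
	}
	containerID, err := rs.CreateContainer(sandboxID, "kube-proxy", "kube-proxy-image") // image string illustrative
	if err != nil {
		return fmt.Errorf("CreateContainer: %w", err)
	}
	return rs.StartContainer(containerID)
}

func main() {
	_ = startPod // schematic only; no concrete runtime is wired up here
}
```

In the real flow these are gRPC calls to containerd, and containerd itself talks to a per-pod shim over the unix:///run/containerd/s/... address, which is why the same shim address appears above for both the sandbox and the kube-proxy container.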
May 14 23:40:37.675328 containerd[1574]: time="2025-05-14T23:40:37.675233332Z" level=info msg="StartContainer for \"f188ce7f1ce188be564bcf292bc72101e289538bef7e65d9f5326432b08e1871\" returns successfully" May 14 23:40:37.712589 containerd[1574]: time="2025-05-14T23:40:37.712558271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-wm7fm,Uid:53e54bf1-b1bc-4926-925b-fd8067decbfb,Namespace:tigera-operator,Attempt:0,}" May 14 23:40:37.727332 containerd[1574]: time="2025-05-14T23:40:37.726469539Z" level=info msg="connecting to shim 7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5" address="unix:///run/containerd/s/2dfe660914408e714a7f4621eb4c6ab03a8dce1a3a5ca40fc6e196669ea0c196" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:37.747549 systemd[1]: Started cri-containerd-7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5.scope - libcontainer container 7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5. May 14 23:40:37.776228 sudo[1876]: pam_unix(sudo:session): session closed for user root May 14 23:40:37.779148 sshd[1875]: Connection closed by 147.75.109.163 port 38526 May 14 23:40:37.781102 sshd-session[1872]: pam_unix(sshd:session): session closed for user core May 14 23:40:37.783942 systemd-logind[1552]: Session 9 logged out. Waiting for processes to exit. May 14 23:40:37.784688 systemd[1]: sshd@6-139.178.70.107:22-147.75.109.163:38526.service: Deactivated successfully. May 14 23:40:37.786074 systemd[1]: session-9.scope: Deactivated successfully. May 14 23:40:37.787705 systemd[1]: session-9.scope: Consumed 3.053s CPU time, 147.1M memory peak. May 14 23:40:37.790238 systemd-logind[1552]: Removed session 9. May 14 23:40:37.792833 containerd[1574]: time="2025-05-14T23:40:37.792811844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-wm7fm,Uid:53e54bf1-b1bc-4926-925b-fd8067decbfb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5\"" May 14 23:40:37.794515 containerd[1574]: time="2025-05-14T23:40:37.794462238Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 14 23:40:38.321370 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1704175049.mount: Deactivated successfully. May 14 23:40:38.667034 kubelet[2831]: I0514 23:40:38.665765 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-x6nmz" podStartSLOduration=2.665744379 podStartE2EDuration="2.665744379s" podCreationTimestamp="2025-05-14 23:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:40:38.665390789 +0000 UTC m=+5.298192457" watchObservedRunningTime="2025-05-14 23:40:38.665744379 +0000 UTC m=+5.298546040" May 14 23:40:39.698532 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount49454641.mount: Deactivated successfully. 
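The pod_startup_latency_tracker entry above reports podStartSLOduration=2.665744379s for kube-proxy with both pull timestamps at the Go zero value, i.e. no image pull is counted and the figure is essentially observedRunningTime minus podCreationTimestamp. A quick check of that arithmetic from the logged timestamps (the tracker takes its own reading, so this differs from the reported value by well under a millisecond):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the kube-proxy-x6nmz pod_startup_latency_tracker entry above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-05-14 23:40:36 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-05-14 23:40:38.665390789 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// With both pull timestamps at the zero value, the reported SLO duration is
	// essentially observedRunningTime - podCreationTimestamp.
	fmt.Println(running.Sub(created)) // 2.665390789s, within a fraction of a millisecond of the logged figure
}
```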
May 14 23:40:40.053533 containerd[1574]: time="2025-05-14T23:40:40.053486785Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:40.054029 containerd[1574]: time="2025-05-14T23:40:40.053936221Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 14 23:40:40.054695 containerd[1574]: time="2025-05-14T23:40:40.054345445Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:40.055602 containerd[1574]: time="2025-05-14T23:40:40.055572052Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:40.056205 containerd[1574]: time="2025-05-14T23:40:40.056087478Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.261571912s" May 14 23:40:40.056205 containerd[1574]: time="2025-05-14T23:40:40.056111492Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 14 23:40:40.061204 containerd[1574]: time="2025-05-14T23:40:40.060604521Z" level=info msg="CreateContainer within sandbox \"7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 14 23:40:40.066637 containerd[1574]: time="2025-05-14T23:40:40.066611818Z" level=info msg="Container a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:40.070223 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2980313232.mount: Deactivated successfully. May 14 23:40:40.079004 containerd[1574]: time="2025-05-14T23:40:40.078971981Z" level=info msg="CreateContainer within sandbox \"7727dce39c8f2256ac13a78938abbe45951128c816a81171eac268645125bbf5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5\"" May 14 23:40:40.108791 containerd[1574]: time="2025-05-14T23:40:40.108238171Z" level=info msg="StartContainer for \"a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5\"" May 14 23:40:40.109077 containerd[1574]: time="2025-05-14T23:40:40.109063929Z" level=info msg="connecting to shim a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5" address="unix:///run/containerd/s/2dfe660914408e714a7f4621eb4c6ab03a8dce1a3a5ca40fc6e196669ea0c196" protocol=ttrpc version=3 May 14 23:40:40.129457 systemd[1]: Started cri-containerd-a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5.scope - libcontainer container a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5. 
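From the containerd figures above, the effective transfer rate of the tigera-operator pull can be backed out: roughly 22 MB read in about 2.26 s, on the order of 9 MiB/s. A small sketch of that arithmetic using the logged numbers:

```go
package main

import "fmt"

func main() {
	// Figures from the containerd entries above for quay.io/tigera/operator:v1.36.7.
	const bytesRead = 22002662      // "bytes read" reported when pulling stopped
	const pullSeconds = 2.261571912 // reported pull duration in seconds

	rate := float64(bytesRead) / pullSeconds
	fmt.Printf("effective pull rate: %.1f MiB/s\n", rate/(1024*1024))
}
```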
May 14 23:40:40.188802 containerd[1574]: time="2025-05-14T23:40:40.188775114Z" level=info msg="StartContainer for \"a8da9d78106f51f2ccfcd47a61d3effdb050003babd31fe8f423352ce4f882a5\" returns successfully" May 14 23:40:40.772300 kubelet[2831]: I0514 23:40:40.772244 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-wm7fm" podStartSLOduration=1.500611153 podStartE2EDuration="3.763643391s" podCreationTimestamp="2025-05-14 23:40:37 +0000 UTC" firstStartedPulling="2025-05-14 23:40:37.793790715 +0000 UTC m=+4.426592374" lastFinishedPulling="2025-05-14 23:40:40.056822953 +0000 UTC m=+6.689624612" observedRunningTime="2025-05-14 23:40:40.757580125 +0000 UTC m=+7.390381794" watchObservedRunningTime="2025-05-14 23:40:40.763643391 +0000 UTC m=+7.396445059" May 14 23:40:43.902229 systemd[1]: Created slice kubepods-besteffort-pod7aca2a56_b3f1_4930_a61c_f402dea317c0.slice - libcontainer container kubepods-besteffort-pod7aca2a56_b3f1_4930_a61c_f402dea317c0.slice. May 14 23:40:43.940344 kubelet[2831]: I0514 23:40:43.940016 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7aca2a56-b3f1-4930-a61c-f402dea317c0-tigera-ca-bundle\") pod \"calico-typha-bb45b457f-hfnhp\" (UID: \"7aca2a56-b3f1-4930-a61c-f402dea317c0\") " pod="calico-system/calico-typha-bb45b457f-hfnhp" May 14 23:40:43.940344 kubelet[2831]: I0514 23:40:43.940053 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7aca2a56-b3f1-4930-a61c-f402dea317c0-typha-certs\") pod \"calico-typha-bb45b457f-hfnhp\" (UID: \"7aca2a56-b3f1-4930-a61c-f402dea317c0\") " pod="calico-system/calico-typha-bb45b457f-hfnhp" May 14 23:40:43.942168 kubelet[2831]: I0514 23:40:43.940428 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmpkf\" (UniqueName: \"kubernetes.io/projected/7aca2a56-b3f1-4930-a61c-f402dea317c0-kube-api-access-dmpkf\") pod \"calico-typha-bb45b457f-hfnhp\" (UID: \"7aca2a56-b3f1-4930-a61c-f402dea317c0\") " pod="calico-system/calico-typha-bb45b457f-hfnhp" May 14 23:40:43.948129 systemd[1]: Created slice kubepods-besteffort-pod025e0036_3e02_4104_8cdd_fedbf0774454.slice - libcontainer container kubepods-besteffort-pod025e0036_3e02_4104_8cdd_fedbf0774454.slice. 
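The VerifyControllerAttachedVolume entries above enumerate calico-typha's volumes: a ConfigMap-backed CA bundle (tigera-ca-bundle), a Secret with the Typha certificates (typha-certs), and the projected API-access token. A hedged client-go sketch that lists a pod's volume sources in the same terms; it assumes in-cluster credentials, and the pod name is simply the one from this log:

```go
package main

import (
	"context"
	"fmt"
	"log"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	// Assumes this runs inside the cluster; use clientcmd with a kubeconfig otherwise.
	cfg, err := rest.InClusterConfig()
	if err != nil {
		log.Fatal(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		log.Fatal(err)
	}
	// Pod name taken from the journal above; it will differ on other nodes.
	pod, err := cs.CoreV1().Pods("calico-system").Get(context.TODO(), "calico-typha-bb45b457f-hfnhp", metav1.GetOptions{})
	if err != nil {
		log.Fatal(err)
	}
	for _, v := range pod.Spec.Volumes {
		switch {
		case v.ConfigMap != nil:
			fmt.Printf("%s: configmap %s\n", v.Name, v.ConfigMap.Name)
		case v.Secret != nil:
			fmt.Printf("%s: secret %s\n", v.Name, v.Secret.SecretName)
		case v.Projected != nil:
			fmt.Printf("%s: projected (%d sources)\n", v.Name, len(v.Projected.Sources))
		default:
			fmt.Printf("%s: other volume source\n", v.Name)
		}
	}
}
```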
May 14 23:40:44.040890 kubelet[2831]: I0514 23:40:44.040690 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-flexvol-driver-host\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.040890 kubelet[2831]: I0514 23:40:44.040727 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-xtables-lock\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.040890 kubelet[2831]: I0514 23:40:44.040743 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-run-calico\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.040890 kubelet[2831]: I0514 23:40:44.040759 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-log-dir\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.040890 kubelet[2831]: I0514 23:40:44.040777 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-lib-modules\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.041284 kubelet[2831]: I0514 23:40:44.040792 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-lib-calico\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.041284 kubelet[2831]: I0514 23:40:44.040809 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/025e0036-3e02-4104-8cdd-fedbf0774454-node-certs\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.041284 kubelet[2831]: I0514 23:40:44.040825 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-policysync\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.041284 kubelet[2831]: I0514 23:40:44.040840 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s25m\" (UniqueName: \"kubernetes.io/projected/025e0036-3e02-4104-8cdd-fedbf0774454-kube-api-access-9s25m\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.041284 kubelet[2831]: I0514 23:40:44.040854 2831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-bin-dir\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.050046 kubelet[2831]: I0514 23:40:44.040867 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-net-dir\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.050046 kubelet[2831]: I0514 23:40:44.042359 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/025e0036-3e02-4104-8cdd-fedbf0774454-tigera-ca-bundle\") pod \"calico-node-79jwn\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " pod="calico-system/calico-node-79jwn" May 14 23:40:44.169374 kubelet[2831]: E0514 23:40:44.169214 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.169374 kubelet[2831]: W0514 23:40:44.169236 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.169374 kubelet[2831]: E0514 23:40:44.169262 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.169476 kubelet[2831]: E0514 23:40:44.169470 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.169498 kubelet[2831]: W0514 23:40:44.169477 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.169498 kubelet[2831]: E0514 23:40:44.169485 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.190384 kubelet[2831]: E0514 23:40:44.190336 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:44.217824 kubelet[2831]: E0514 23:40:44.217750 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.217824 kubelet[2831]: W0514 23:40:44.217766 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.217824 kubelet[2831]: E0514 23:40:44.217780 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.217981 kubelet[2831]: E0514 23:40:44.217974 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218014 kubelet[2831]: W0514 23:40:44.218009 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218086 kubelet[2831]: E0514 23:40:44.218043 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.218203 kubelet[2831]: E0514 23:40:44.218151 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218203 kubelet[2831]: W0514 23:40:44.218158 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218203 kubelet[2831]: E0514 23:40:44.218162 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.218285 kubelet[2831]: E0514 23:40:44.218280 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218344 kubelet[2831]: W0514 23:40:44.218338 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218421 kubelet[2831]: E0514 23:40:44.218376 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.218528 kubelet[2831]: E0514 23:40:44.218483 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218528 kubelet[2831]: W0514 23:40:44.218489 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218528 kubelet[2831]: E0514 23:40:44.218494 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.218606 kubelet[2831]: E0514 23:40:44.218601 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218641 kubelet[2831]: W0514 23:40:44.218635 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218674 kubelet[2831]: E0514 23:40:44.218669 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.218798 kubelet[2831]: E0514 23:40:44.218791 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218892 kubelet[2831]: W0514 23:40:44.218832 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.218892 kubelet[2831]: E0514 23:40:44.218852 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.218959 kubelet[2831]: E0514 23:40:44.218954 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.218990 kubelet[2831]: W0514 23:40:44.218985 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219078 kubelet[2831]: E0514 23:40:44.219039 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.219191 kubelet[2831]: E0514 23:40:44.219142 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219191 kubelet[2831]: W0514 23:40:44.219147 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219191 kubelet[2831]: E0514 23:40:44.219152 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.219273 kubelet[2831]: E0514 23:40:44.219267 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219350 kubelet[2831]: W0514 23:40:44.219299 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219350 kubelet[2831]: E0514 23:40:44.219313 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.219418 kubelet[2831]: E0514 23:40:44.219413 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219449 kubelet[2831]: W0514 23:40:44.219444 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219481 kubelet[2831]: E0514 23:40:44.219474 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.219652 kubelet[2831]: E0514 23:40:44.219601 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219652 kubelet[2831]: W0514 23:40:44.219607 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219652 kubelet[2831]: E0514 23:40:44.219612 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.219736 kubelet[2831]: E0514 23:40:44.219731 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219767 kubelet[2831]: W0514 23:40:44.219762 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219799 kubelet[2831]: E0514 23:40:44.219794 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.219920 kubelet[2831]: E0514 23:40:44.219914 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.219987 kubelet[2831]: W0514 23:40:44.219946 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.219987 kubelet[2831]: E0514 23:40:44.219952 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.220108 kubelet[2831]: E0514 23:40:44.220061 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.220108 kubelet[2831]: W0514 23:40:44.220067 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.220108 kubelet[2831]: E0514 23:40:44.220072 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.220237 kubelet[2831]: E0514 23:40:44.220231 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.220325 kubelet[2831]: W0514 23:40:44.220265 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.220325 kubelet[2831]: E0514 23:40:44.220272 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.220511 kubelet[2831]: E0514 23:40:44.220452 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.220511 kubelet[2831]: W0514 23:40:44.220475 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.220511 kubelet[2831]: E0514 23:40:44.220480 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.220566 containerd[1574]: time="2025-05-14T23:40:44.220488569Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bb45b457f-hfnhp,Uid:7aca2a56-b3f1-4930-a61c-f402dea317c0,Namespace:calico-system,Attempt:0,}" May 14 23:40:44.220841 kubelet[2831]: E0514 23:40:44.220788 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.220841 kubelet[2831]: W0514 23:40:44.220795 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.220841 kubelet[2831]: E0514 23:40:44.220800 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.220928 kubelet[2831]: E0514 23:40:44.220923 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.220959 kubelet[2831]: W0514 23:40:44.220954 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.220991 kubelet[2831]: E0514 23:40:44.220986 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.221136 kubelet[2831]: E0514 23:40:44.221129 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.221268 kubelet[2831]: W0514 23:40:44.221228 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.221268 kubelet[2831]: E0514 23:40:44.221236 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.244286 kubelet[2831]: E0514 23:40:44.244264 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.244374 kubelet[2831]: W0514 23:40:44.244283 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.244374 kubelet[2831]: E0514 23:40:44.244332 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.244468 kubelet[2831]: I0514 23:40:44.244359 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/018bb5c2-b3b5-4cf8-a7ea-a530d2470442-socket-dir\") pod \"csi-node-driver-9zp9w\" (UID: \"018bb5c2-b3b5-4cf8-a7ea-a530d2470442\") " pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:44.244565 kubelet[2831]: E0514 23:40:44.244554 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.244592 kubelet[2831]: W0514 23:40:44.244566 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.244592 kubelet[2831]: E0514 23:40:44.244578 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.244627 kubelet[2831]: I0514 23:40:44.244592 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/018bb5c2-b3b5-4cf8-a7ea-a530d2470442-varrun\") pod \"csi-node-driver-9zp9w\" (UID: \"018bb5c2-b3b5-4cf8-a7ea-a530d2470442\") " pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:44.244748 kubelet[2831]: E0514 23:40:44.244736 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.244748 kubelet[2831]: W0514 23:40:44.244747 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.244797 kubelet[2831]: E0514 23:40:44.244758 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.244797 kubelet[2831]: I0514 23:40:44.244772 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79wqg\" (UniqueName: \"kubernetes.io/projected/018bb5c2-b3b5-4cf8-a7ea-a530d2470442-kube-api-access-79wqg\") pod \"csi-node-driver-9zp9w\" (UID: \"018bb5c2-b3b5-4cf8-a7ea-a530d2470442\") " pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:44.244941 kubelet[2831]: E0514 23:40:44.244931 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.244971 kubelet[2831]: W0514 23:40:44.244942 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.244971 kubelet[2831]: E0514 23:40:44.244954 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.244971 kubelet[2831]: I0514 23:40:44.244968 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/018bb5c2-b3b5-4cf8-a7ea-a530d2470442-registration-dir\") pod \"csi-node-driver-9zp9w\" (UID: \"018bb5c2-b3b5-4cf8-a7ea-a530d2470442\") " pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:44.245138 kubelet[2831]: E0514 23:40:44.245128 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.245138 kubelet[2831]: W0514 23:40:44.245137 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.245184 kubelet[2831]: E0514 23:40:44.245149 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.245203 kubelet[2831]: I0514 23:40:44.245182 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/018bb5c2-b3b5-4cf8-a7ea-a530d2470442-kubelet-dir\") pod \"csi-node-driver-9zp9w\" (UID: \"018bb5c2-b3b5-4cf8-a7ea-a530d2470442\") " pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:44.245362 kubelet[2831]: E0514 23:40:44.245350 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.245394 kubelet[2831]: W0514 23:40:44.245361 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.245394 kubelet[2831]: E0514 23:40:44.245372 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.245549 kubelet[2831]: E0514 23:40:44.245536 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.245549 kubelet[2831]: W0514 23:40:44.245548 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.245592 kubelet[2831]: E0514 23:40:44.245577 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.245738 kubelet[2831]: E0514 23:40:44.245728 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.245768 kubelet[2831]: W0514 23:40:44.245738 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.245768 kubelet[2831]: E0514 23:40:44.245750 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.245892 kubelet[2831]: E0514 23:40:44.245871 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.245917 kubelet[2831]: W0514 23:40:44.245894 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.245968 kubelet[2831]: E0514 23:40:44.245937 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.246047 kubelet[2831]: E0514 23:40:44.246027 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.246077 kubelet[2831]: W0514 23:40:44.246048 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.246126 kubelet[2831]: E0514 23:40:44.246091 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.246186 kubelet[2831]: E0514 23:40:44.246178 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.246209 kubelet[2831]: W0514 23:40:44.246198 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.246247 kubelet[2831]: E0514 23:40:44.246236 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.246354 kubelet[2831]: E0514 23:40:44.246345 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.246406 kubelet[2831]: W0514 23:40:44.246355 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.246406 kubelet[2831]: E0514 23:40:44.246391 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.246505 kubelet[2831]: E0514 23:40:44.246496 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.246530 kubelet[2831]: W0514 23:40:44.246504 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.246530 kubelet[2831]: E0514 23:40:44.246512 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.246660 kubelet[2831]: E0514 23:40:44.246650 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.246689 kubelet[2831]: W0514 23:40:44.246660 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.246689 kubelet[2831]: E0514 23:40:44.246668 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.246811 kubelet[2831]: E0514 23:40:44.246802 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.252492 kubelet[2831]: W0514 23:40:44.246811 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.252492 kubelet[2831]: E0514 23:40:44.246819 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.252805 containerd[1574]: time="2025-05-14T23:40:44.252680712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79jwn,Uid:025e0036-3e02-4104-8cdd-fedbf0774454,Namespace:calico-system,Attempt:0,}" May 14 23:40:44.345976 kubelet[2831]: E0514 23:40:44.345899 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.345976 kubelet[2831]: W0514 23:40:44.345918 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.345976 kubelet[2831]: E0514 23:40:44.345932 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.346244 kubelet[2831]: E0514 23:40:44.346117 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.346244 kubelet[2831]: W0514 23:40:44.346123 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.346244 kubelet[2831]: E0514 23:40:44.346132 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.346433 kubelet[2831]: E0514 23:40:44.346360 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.346433 kubelet[2831]: W0514 23:40:44.346368 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.346433 kubelet[2831]: E0514 23:40:44.346382 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.346583 kubelet[2831]: E0514 23:40:44.346528 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.346583 kubelet[2831]: W0514 23:40:44.346536 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.346699 kubelet[2831]: E0514 23:40:44.346642 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.346948 kubelet[2831]: E0514 23:40:44.346842 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.346948 kubelet[2831]: W0514 23:40:44.346849 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.346948 kubelet[2831]: E0514 23:40:44.346860 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.347058 kubelet[2831]: E0514 23:40:44.347045 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347058 kubelet[2831]: W0514 23:40:44.347057 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347124 kubelet[2831]: E0514 23:40:44.347067 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.347230 kubelet[2831]: E0514 23:40:44.347211 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347230 kubelet[2831]: W0514 23:40:44.347221 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347380 kubelet[2831]: E0514 23:40:44.347234 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.347455 kubelet[2831]: E0514 23:40:44.347427 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347455 kubelet[2831]: W0514 23:40:44.347440 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347568 kubelet[2831]: E0514 23:40:44.347481 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.347568 kubelet[2831]: E0514 23:40:44.347559 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347568 kubelet[2831]: W0514 23:40:44.347565 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347701 kubelet[2831]: E0514 23:40:44.347587 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.347701 kubelet[2831]: E0514 23:40:44.347671 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347701 kubelet[2831]: W0514 23:40:44.347677 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347701 kubelet[2831]: E0514 23:40:44.347689 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.347934 kubelet[2831]: E0514 23:40:44.347803 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.347934 kubelet[2831]: W0514 23:40:44.347810 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.347934 kubelet[2831]: E0514 23:40:44.347820 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.348037 kubelet[2831]: E0514 23:40:44.347951 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348037 kubelet[2831]: W0514 23:40:44.347958 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348037 kubelet[2831]: E0514 23:40:44.347965 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.348115 kubelet[2831]: E0514 23:40:44.348093 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348115 kubelet[2831]: W0514 23:40:44.348100 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348115 kubelet[2831]: E0514 23:40:44.348107 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.348532 kubelet[2831]: E0514 23:40:44.348303 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348532 kubelet[2831]: W0514 23:40:44.348321 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348532 kubelet[2831]: E0514 23:40:44.348329 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.348532 kubelet[2831]: E0514 23:40:44.348466 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348532 kubelet[2831]: W0514 23:40:44.348473 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348532 kubelet[2831]: E0514 23:40:44.348480 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.348812 kubelet[2831]: E0514 23:40:44.348732 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348812 kubelet[2831]: W0514 23:40:44.348739 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348812 kubelet[2831]: E0514 23:40:44.348751 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.348924 kubelet[2831]: E0514 23:40:44.348909 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.348924 kubelet[2831]: W0514 23:40:44.348916 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.348993 kubelet[2831]: E0514 23:40:44.348928 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.349079 kubelet[2831]: E0514 23:40:44.349056 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.349079 kubelet[2831]: W0514 23:40:44.349063 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.349206 kubelet[2831]: E0514 23:40:44.349151 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.349358 kubelet[2831]: E0514 23:40:44.349346 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.349358 kubelet[2831]: W0514 23:40:44.349355 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.349467 kubelet[2831]: E0514 23:40:44.349458 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.349740 kubelet[2831]: E0514 23:40:44.349712 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.349740 kubelet[2831]: W0514 23:40:44.349723 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.349740 kubelet[2831]: E0514 23:40:44.349733 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.349889 kubelet[2831]: E0514 23:40:44.349872 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.349955 kubelet[2831]: W0514 23:40:44.349888 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.349955 kubelet[2831]: E0514 23:40:44.349899 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.350090 kubelet[2831]: E0514 23:40:44.350079 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.350090 kubelet[2831]: W0514 23:40:44.350088 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.350169 kubelet[2831]: E0514 23:40:44.350111 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.350254 kubelet[2831]: E0514 23:40:44.350242 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.350254 kubelet[2831]: W0514 23:40:44.350252 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.350336 kubelet[2831]: E0514 23:40:44.350260 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.350446 kubelet[2831]: E0514 23:40:44.350434 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.350446 kubelet[2831]: W0514 23:40:44.350444 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.350512 kubelet[2831]: E0514 23:40:44.350452 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.355700 kubelet[2831]: E0514 23:40:44.355643 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.355700 kubelet[2831]: W0514 23:40:44.355659 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.355700 kubelet[2831]: E0514 23:40:44.355673 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.448477 kubelet[2831]: E0514 23:40:44.448409 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.448477 kubelet[2831]: W0514 23:40:44.448432 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.448477 kubelet[2831]: E0514 23:40:44.448446 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.549490 kubelet[2831]: E0514 23:40:44.549471 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.549490 kubelet[2831]: W0514 23:40:44.549485 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.549490 kubelet[2831]: E0514 23:40:44.549498 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.650456 kubelet[2831]: E0514 23:40:44.650443 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.655532 kubelet[2831]: W0514 23:40:44.650514 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.655532 kubelet[2831]: E0514 23:40:44.650526 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.724788 kubelet[2831]: E0514 23:40:44.724686 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.724788 kubelet[2831]: W0514 23:40:44.724710 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.724788 kubelet[2831]: E0514 23:40:44.724723 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.725084 kubelet[2831]: E0514 23:40:44.725077 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.725170 kubelet[2831]: W0514 23:40:44.725126 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.725170 kubelet[2831]: E0514 23:40:44.725154 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.725421 kubelet[2831]: E0514 23:40:44.725375 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.725421 kubelet[2831]: W0514 23:40:44.725383 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.725421 kubelet[2831]: E0514 23:40:44.725388 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.725644 kubelet[2831]: E0514 23:40:44.725590 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.725644 kubelet[2831]: W0514 23:40:44.725596 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.725644 kubelet[2831]: E0514 23:40:44.725601 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.725849 kubelet[2831]: E0514 23:40:44.725787 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.725849 kubelet[2831]: W0514 23:40:44.725803 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.725849 kubelet[2831]: E0514 23:40:44.725808 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.725996 kubelet[2831]: E0514 23:40:44.725967 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.725996 kubelet[2831]: W0514 23:40:44.725973 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.725996 kubelet[2831]: E0514 23:40:44.725979 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.726163 kubelet[2831]: E0514 23:40:44.726144 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.726241 kubelet[2831]: W0514 23:40:44.726150 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.726241 kubelet[2831]: E0514 23:40:44.726195 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.726354 kubelet[2831]: E0514 23:40:44.726321 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.726354 kubelet[2831]: W0514 23:40:44.726327 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.726354 kubelet[2831]: E0514 23:40:44.726332 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.726521 kubelet[2831]: E0514 23:40:44.726494 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.726521 kubelet[2831]: W0514 23:40:44.726500 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.726521 kubelet[2831]: E0514 23:40:44.726504 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.726745 kubelet[2831]: E0514 23:40:44.726687 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.726745 kubelet[2831]: W0514 23:40:44.726694 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.726745 kubelet[2831]: E0514 23:40:44.726699 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.726868 kubelet[2831]: E0514 23:40:44.726838 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.726868 kubelet[2831]: W0514 23:40:44.726843 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.726868 kubelet[2831]: E0514 23:40:44.726849 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.727074 kubelet[2831]: E0514 23:40:44.727041 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727074 kubelet[2831]: W0514 23:40:44.727047 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727074 kubelet[2831]: E0514 23:40:44.727052 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.727255 kubelet[2831]: E0514 23:40:44.727210 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727255 kubelet[2831]: W0514 23:40:44.727215 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727255 kubelet[2831]: E0514 23:40:44.727220 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.727443 kubelet[2831]: E0514 23:40:44.727374 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727443 kubelet[2831]: W0514 23:40:44.727380 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727443 kubelet[2831]: E0514 23:40:44.727385 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.727566 kubelet[2831]: E0514 23:40:44.727484 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727566 kubelet[2831]: W0514 23:40:44.727488 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727566 kubelet[2831]: E0514 23:40:44.727494 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.727780 kubelet[2831]: E0514 23:40:44.727716 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727780 kubelet[2831]: W0514 23:40:44.727721 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727780 kubelet[2831]: E0514 23:40:44.727727 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.727898 kubelet[2831]: E0514 23:40:44.727833 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.727898 kubelet[2831]: W0514 23:40:44.727838 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.727898 kubelet[2831]: E0514 23:40:44.727842 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.728073 kubelet[2831]: E0514 23:40:44.728045 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728073 kubelet[2831]: W0514 23:40:44.728050 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728073 kubelet[2831]: E0514 23:40:44.728055 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.728294 kubelet[2831]: E0514 23:40:44.728230 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728294 kubelet[2831]: W0514 23:40:44.728247 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728294 kubelet[2831]: E0514 23:40:44.728254 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.728466 kubelet[2831]: E0514 23:40:44.728399 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728466 kubelet[2831]: W0514 23:40:44.728404 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728466 kubelet[2831]: E0514 23:40:44.728410 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.728610 kubelet[2831]: E0514 23:40:44.728582 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728610 kubelet[2831]: W0514 23:40:44.728588 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728610 kubelet[2831]: E0514 23:40:44.728593 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.728817 kubelet[2831]: E0514 23:40:44.728769 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728817 kubelet[2831]: W0514 23:40:44.728776 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728817 kubelet[2831]: E0514 23:40:44.728781 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.728966 kubelet[2831]: E0514 23:40:44.728925 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.728966 kubelet[2831]: W0514 23:40:44.728929 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.728966 kubelet[2831]: E0514 23:40:44.728934 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.729194 kubelet[2831]: E0514 23:40:44.729140 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.729194 kubelet[2831]: W0514 23:40:44.729146 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.729194 kubelet[2831]: E0514 23:40:44.729152 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.729366 kubelet[2831]: E0514 23:40:44.729327 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.729366 kubelet[2831]: W0514 23:40:44.729334 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.729366 kubelet[2831]: E0514 23:40:44.729339 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.751610 kubelet[2831]: E0514 23:40:44.751598 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.751719 kubelet[2831]: W0514 23:40:44.751683 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.751719 kubelet[2831]: E0514 23:40:44.751696 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:44.852820 kubelet[2831]: E0514 23:40:44.852765 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.852820 kubelet[2831]: W0514 23:40:44.852779 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.852820 kubelet[2831]: E0514 23:40:44.852794 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:44.953420 kubelet[2831]: E0514 23:40:44.953392 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:44.953420 kubelet[2831]: W0514 23:40:44.953413 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:44.953420 kubelet[2831]: E0514 23:40:44.953427 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.054115 kubelet[2831]: E0514 23:40:45.054080 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.054115 kubelet[2831]: W0514 23:40:45.054096 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.054115 kubelet[2831]: E0514 23:40:45.054109 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.154775 kubelet[2831]: E0514 23:40:45.154721 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.154775 kubelet[2831]: W0514 23:40:45.154737 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.154775 kubelet[2831]: E0514 23:40:45.154750 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.255614 kubelet[2831]: E0514 23:40:45.255582 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.255614 kubelet[2831]: W0514 23:40:45.255602 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.255614 kubelet[2831]: E0514 23:40:45.255617 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:45.327049 kubelet[2831]: E0514 23:40:45.326957 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.327049 kubelet[2831]: W0514 23:40:45.326969 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.327049 kubelet[2831]: E0514 23:40:45.326983 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.510856 kubelet[2831]: E0514 23:40:45.510591 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:45.531006 containerd[1574]: time="2025-05-14T23:40:45.530972746Z" level=info msg="connecting to shim fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039" address="unix:///run/containerd/s/0adf802ac9c6216c4a3d3bf903e3a3a1f30bcbe46416b500e2a88eef1f4a0f1f" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:45.556446 systemd[1]: Started cri-containerd-fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039.scope - libcontainer container fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039. May 14 23:40:45.559914 containerd[1574]: time="2025-05-14T23:40:45.559818705Z" level=info msg="connecting to shim 6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" namespace=k8s.io protocol=ttrpc version=3 May 14 23:40:45.585418 systemd[1]: Started cri-containerd-6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e.scope - libcontainer container 6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e. 
May 14 23:40:45.625757 containerd[1574]: time="2025-05-14T23:40:45.625550350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-79jwn,Uid:025e0036-3e02-4104-8cdd-fedbf0774454,Namespace:calico-system,Attempt:0,} returns sandbox id \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\"" May 14 23:40:45.626899 containerd[1574]: time="2025-05-14T23:40:45.626760124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 14 23:40:45.664550 containerd[1574]: time="2025-05-14T23:40:45.664522905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-bb45b457f-hfnhp,Uid:7aca2a56-b3f1-4930-a61c-f402dea317c0,Namespace:calico-system,Attempt:0,} returns sandbox id \"fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039\"" May 14 23:40:45.837829 kubelet[2831]: E0514 23:40:45.837752 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.837829 kubelet[2831]: W0514 23:40:45.837769 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.837829 kubelet[2831]: E0514 23:40:45.837783 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.837960 kubelet[2831]: E0514 23:40:45.837919 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.837960 kubelet[2831]: W0514 23:40:45.837926 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.837960 kubelet[2831]: E0514 23:40:45.837932 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838028 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838275 kubelet[2831]: W0514 23:40:45.838035 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838044 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838141 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838275 kubelet[2831]: W0514 23:40:45.838146 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838150 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838264 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838275 kubelet[2831]: W0514 23:40:45.838268 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838275 kubelet[2831]: E0514 23:40:45.838276 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838530 kubelet[2831]: E0514 23:40:45.838417 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838530 kubelet[2831]: W0514 23:40:45.838421 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838530 kubelet[2831]: E0514 23:40:45.838427 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838530 kubelet[2831]: E0514 23:40:45.838516 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838530 kubelet[2831]: W0514 23:40:45.838522 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838530 kubelet[2831]: E0514 23:40:45.838527 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838697 kubelet[2831]: E0514 23:40:45.838613 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838697 kubelet[2831]: W0514 23:40:45.838619 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838697 kubelet[2831]: E0514 23:40:45.838624 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838773 kubelet[2831]: E0514 23:40:45.838719 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838773 kubelet[2831]: W0514 23:40:45.838724 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838773 kubelet[2831]: E0514 23:40:45.838729 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:45.838862 kubelet[2831]: E0514 23:40:45.838813 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838862 kubelet[2831]: W0514 23:40:45.838817 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838862 kubelet[2831]: E0514 23:40:45.838821 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.838957 kubelet[2831]: E0514 23:40:45.838902 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.838957 kubelet[2831]: W0514 23:40:45.838906 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.838957 kubelet[2831]: E0514 23:40:45.838910 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.839072 kubelet[2831]: E0514 23:40:45.839034 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.839072 kubelet[2831]: W0514 23:40:45.839039 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.839072 kubelet[2831]: E0514 23:40:45.839044 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.839150 kubelet[2831]: E0514 23:40:45.839132 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.839150 kubelet[2831]: W0514 23:40:45.839136 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.839150 kubelet[2831]: E0514 23:40:45.839141 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:45.839238 kubelet[2831]: E0514 23:40:45.839222 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.839238 kubelet[2831]: W0514 23:40:45.839229 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.839238 kubelet[2831]: E0514 23:40:45.839235 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:45.839341 kubelet[2831]: E0514 23:40:45.839335 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:45.839341 kubelet[2831]: W0514 23:40:45.839340 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:45.839388 kubelet[2831]: E0514 23:40:45.839344 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.744887 kubelet[2831]: E0514 23:40:46.744865 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.744887 kubelet[2831]: W0514 23:40:46.744881 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745139 kubelet[2831]: E0514 23:40:46.744896 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745139 kubelet[2831]: E0514 23:40:46.745008 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745139 kubelet[2831]: W0514 23:40:46.745022 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745139 kubelet[2831]: E0514 23:40:46.745029 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745139 kubelet[2831]: E0514 23:40:46.745123 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745139 kubelet[2831]: W0514 23:40:46.745128 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745139 kubelet[2831]: E0514 23:40:46.745133 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745258 kubelet[2831]: E0514 23:40:46.745227 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745258 kubelet[2831]: W0514 23:40:46.745231 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745258 kubelet[2831]: E0514 23:40:46.745236 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:46.745354 kubelet[2831]: E0514 23:40:46.745343 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745354 kubelet[2831]: W0514 23:40:46.745351 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745396 kubelet[2831]: E0514 23:40:46.745356 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745454 kubelet[2831]: E0514 23:40:46.745444 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745454 kubelet[2831]: W0514 23:40:46.745452 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745492 kubelet[2831]: E0514 23:40:46.745456 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745556 kubelet[2831]: E0514 23:40:46.745546 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745556 kubelet[2831]: W0514 23:40:46.745554 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745595 kubelet[2831]: E0514 23:40:46.745560 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745670 kubelet[2831]: E0514 23:40:46.745659 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745670 kubelet[2831]: W0514 23:40:46.745668 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745706 kubelet[2831]: E0514 23:40:46.745672 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745766 kubelet[2831]: E0514 23:40:46.745756 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745766 kubelet[2831]: W0514 23:40:46.745764 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745814 kubelet[2831]: E0514 23:40:46.745768 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:46.745876 kubelet[2831]: E0514 23:40:46.745861 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745876 kubelet[2831]: W0514 23:40:46.745869 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745876 kubelet[2831]: E0514 23:40:46.745874 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.745974 kubelet[2831]: E0514 23:40:46.745961 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.745974 kubelet[2831]: W0514 23:40:46.745968 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.745974 kubelet[2831]: E0514 23:40:46.745973 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.746071 kubelet[2831]: E0514 23:40:46.746062 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.746071 kubelet[2831]: W0514 23:40:46.746070 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.746110 kubelet[2831]: E0514 23:40:46.746075 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.746185 kubelet[2831]: E0514 23:40:46.746176 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.746185 kubelet[2831]: W0514 23:40:46.746184 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.746270 kubelet[2831]: E0514 23:40:46.746190 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:46.746300 kubelet[2831]: E0514 23:40:46.746290 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.746300 kubelet[2831]: W0514 23:40:46.746298 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.746348 kubelet[2831]: E0514 23:40:46.746302 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 23:40:46.746411 kubelet[2831]: E0514 23:40:46.746402 2831 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 23:40:46.746411 kubelet[2831]: W0514 23:40:46.746409 2831 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 23:40:46.746450 kubelet[2831]: E0514 23:40:46.746413 2831 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 23:40:47.511334 kubelet[2831]: E0514 23:40:47.510754 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:47.636206 containerd[1574]: time="2025-05-14T23:40:47.635567931Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:47.640633 containerd[1574]: time="2025-05-14T23:40:47.640537726Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 14 23:40:47.646628 containerd[1574]: time="2025-05-14T23:40:47.646587858Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:47.652300 containerd[1574]: time="2025-05-14T23:40:47.652246094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:47.652967 containerd[1574]: time="2025-05-14T23:40:47.652671913Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.025886947s" May 14 23:40:47.652967 containerd[1574]: time="2025-05-14T23:40:47.652704375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 14 23:40:47.653896 containerd[1574]: time="2025-05-14T23:40:47.653867493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 14 23:40:47.658146 containerd[1574]: time="2025-05-14T23:40:47.658046242Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 23:40:47.731282 containerd[1574]: time="2025-05-14T23:40:47.730779457Z" level=info msg="Container b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:47.770383 containerd[1574]: time="2025-05-14T23:40:47.770295613Z" level=info 
msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\"" May 14 23:40:47.770937 containerd[1574]: time="2025-05-14T23:40:47.770898363Z" level=info msg="StartContainer for \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\"" May 14 23:40:47.772394 containerd[1574]: time="2025-05-14T23:40:47.772364026Z" level=info msg="connecting to shim b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:40:47.795555 systemd[1]: Started cri-containerd-b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42.scope - libcontainer container b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42. May 14 23:40:47.837067 systemd[1]: cri-containerd-b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42.scope: Deactivated successfully. May 14 23:40:47.874705 containerd[1574]: time="2025-05-14T23:40:47.874679547Z" level=info msg="StartContainer for \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" returns successfully" May 14 23:40:47.925007 containerd[1574]: time="2025-05-14T23:40:47.924966297Z" level=info msg="received exit event container_id:\"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" id:\"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" pid:3480 exited_at:{seconds:1747266047 nanos:838605792}" May 14 23:40:47.947491 containerd[1574]: time="2025-05-14T23:40:47.947381766Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" id:\"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" pid:3480 exited_at:{seconds:1747266047 nanos:838605792}" May 14 23:40:47.954989 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42-rootfs.mount: Deactivated successfully. 
May 14 23:40:49.510755 kubelet[2831]: E0514 23:40:49.510541 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:51.209170 containerd[1574]: time="2025-05-14T23:40:51.209096987Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:51.209622 containerd[1574]: time="2025-05-14T23:40:51.209473125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 14 23:40:51.209653 containerd[1574]: time="2025-05-14T23:40:51.209619843Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:51.212226 containerd[1574]: time="2025-05-14T23:40:51.211674938Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.557675067s" May 14 23:40:51.212226 containerd[1574]: time="2025-05-14T23:40:51.211697781Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 14 23:40:51.215329 containerd[1574]: time="2025-05-14T23:40:51.213909170Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 14 23:40:51.226956 containerd[1574]: time="2025-05-14T23:40:51.226105849Z" level=info msg="CreateContainer within sandbox \"fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 14 23:40:51.238066 containerd[1574]: time="2025-05-14T23:40:51.237660009Z" level=info msg="Container 489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:51.239674 containerd[1574]: time="2025-05-14T23:40:51.238626669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:51.239243 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2831833711.mount: Deactivated successfully. 
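The typha pull above reports 31919484 bytes fetched in 3.557675067s; a back-of-the-envelope check of the effective transfer rate, using only the numbers from that entry (rough arithmetic, ignoring decompression and any reused layers):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values taken from the "Pulled image ... typha:v3.29.3" entry above.
	const sizeBytes = 31919484
	elapsed := 3557675067 * time.Nanosecond // 3.557675067s

	mibPerSec := float64(sizeBytes) / (1 << 20) / elapsed.Seconds()
	fmt.Printf("~%.1f MiB/s effective pull rate\n", mibPerSec) // roughly 8.6 MiB/s
}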
May 14 23:40:51.244830 containerd[1574]: time="2025-05-14T23:40:51.244803187Z" level=info msg="CreateContainer within sandbox \"fec9cfdd3b9d721f6abaf113db903ca2599ad1c126aba7cda34ea854a2d67039\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6\"" May 14 23:40:51.245583 containerd[1574]: time="2025-05-14T23:40:51.245559411Z" level=info msg="StartContainer for \"489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6\"" May 14 23:40:51.246781 containerd[1574]: time="2025-05-14T23:40:51.246640305Z" level=info msg="connecting to shim 489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6" address="unix:///run/containerd/s/0adf802ac9c6216c4a3d3bf903e3a3a1f30bcbe46416b500e2a88eef1f4a0f1f" protocol=ttrpc version=3 May 14 23:40:51.269481 systemd[1]: Started cri-containerd-489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6.scope - libcontainer container 489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6. May 14 23:40:51.308163 containerd[1574]: time="2025-05-14T23:40:51.308120236Z" level=info msg="StartContainer for \"489709362fc109a9f6cd6a48404489c4d0fc85668a29006fffd5ad5b8fb27ea6\" returns successfully" May 14 23:40:51.510479 kubelet[2831]: E0514 23:40:51.510439 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:51.681665 kubelet[2831]: I0514 23:40:51.681628 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-bb45b457f-hfnhp" podStartSLOduration=3.133373114 podStartE2EDuration="8.681616538s" podCreationTimestamp="2025-05-14 23:40:43 +0000 UTC" firstStartedPulling="2025-05-14 23:40:45.665225251 +0000 UTC m=+12.298026910" lastFinishedPulling="2025-05-14 23:40:51.213468669 +0000 UTC m=+17.846270334" observedRunningTime="2025-05-14 23:40:51.681065282 +0000 UTC m=+18.313866948" watchObservedRunningTime="2025-05-14 23:40:51.681616538 +0000 UTC m=+18.314418200" May 14 23:40:52.675571 kubelet[2831]: I0514 23:40:52.675553 2831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 23:40:53.511973 kubelet[2831]: E0514 23:40:53.511335 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:54.555416 containerd[1574]: time="2025-05-14T23:40:54.555391393Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:54.556343 containerd[1574]: time="2025-05-14T23:40:54.556277570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 14 23:40:54.556644 containerd[1574]: time="2025-05-14T23:40:54.556627088Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:54.558038 containerd[1574]: time="2025-05-14T23:40:54.557491870Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:40:54.558038 containerd[1574]: time="2025-05-14T23:40:54.557968150Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 3.344034967s" May 14 23:40:54.558038 containerd[1574]: time="2025-05-14T23:40:54.557984680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 14 23:40:54.560046 containerd[1574]: time="2025-05-14T23:40:54.559960601Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 23:40:54.564354 containerd[1574]: time="2025-05-14T23:40:54.564331876Z" level=info msg="Container 27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2: CDI devices from CRI Config.CDIDevices: []" May 14 23:40:54.585469 containerd[1574]: time="2025-05-14T23:40:54.585447646Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\"" May 14 23:40:54.590501 containerd[1574]: time="2025-05-14T23:40:54.590174061Z" level=info msg="StartContainer for \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\"" May 14 23:40:54.591141 containerd[1574]: time="2025-05-14T23:40:54.591102485Z" level=info msg="connecting to shim 27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:40:54.647410 systemd[1]: Started cri-containerd-27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2.scope - libcontainer container 27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2. May 14 23:40:54.713706 containerd[1574]: time="2025-05-14T23:40:54.713687100Z" level=info msg="StartContainer for \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" returns successfully" May 14 23:40:55.511197 kubelet[2831]: E0514 23:40:55.510989 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:40:55.887664 systemd[1]: cri-containerd-27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2.scope: Deactivated successfully. May 14 23:40:55.888132 systemd[1]: cri-containerd-27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2.scope: Consumed 297ms CPU time, 150.1M memory peak, 12K read from disk, 154M written to disk. 
May 14 23:40:55.897271 containerd[1574]: time="2025-05-14T23:40:55.897234736Z" level=info msg="received exit event container_id:\"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" id:\"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" pid:3576 exited_at:{seconds:1747266055 nanos:888628236}" May 14 23:40:55.897508 containerd[1574]: time="2025-05-14T23:40:55.897448516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" id:\"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" pid:3576 exited_at:{seconds:1747266055 nanos:888628236}" May 14 23:40:55.923814 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2-rootfs.mount: Deactivated successfully. May 14 23:40:56.042501 kubelet[2831]: I0514 23:40:56.042411 2831 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 14 23:40:56.259982 systemd[1]: Created slice kubepods-besteffort-pod18a843ce_b18e_4959_8f6e_0f50bd293efe.slice - libcontainer container kubepods-besteffort-pod18a843ce_b18e_4959_8f6e_0f50bd293efe.slice. May 14 23:40:56.260868 systemd[1]: Created slice kubepods-burstable-pod5e86d3e4_1ed5_4ded_9be6_007462b12e77.slice - libcontainer container kubepods-burstable-pod5e86d3e4_1ed5_4ded_9be6_007462b12e77.slice. May 14 23:40:56.265515 systemd[1]: Created slice kubepods-besteffort-pod26bdfeca_0687_46a9_90a0_450c95fd195f.slice - libcontainer container kubepods-besteffort-pod26bdfeca_0687_46a9_90a0_450c95fd195f.slice. May 14 23:40:56.267021 systemd[1]: Created slice kubepods-besteffort-podb16fe825_67e1_4ddb_b1d8_e5a537b879ea.slice - libcontainer container kubepods-besteffort-podb16fe825_67e1_4ddb_b1d8_e5a537b879ea.slice. May 14 23:40:56.269945 systemd[1]: Created slice kubepods-burstable-pod91fe5f31_0539_49b6_83f5_1dfaad928a0a.slice - libcontainer container kubepods-burstable-pod91fe5f31_0539_49b6_83f5_1dfaad928a0a.slice. 
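The kubepods-besteffort-pod.../kubepods-burstable-pod... slices created above follow kubelet's systemd cgroup naming: the pod's QoS class plus its UID with dashes converted to underscores, since dashes denote nesting in systemd slice names. A small sketch of that convention using a UID from the log (it mirrors the names visible above, not kubelet's internal cgroup manager):

package main

import (
	"fmt"
	"strings"
)

// podSliceName builds the slice name kubelet uses for a besteffort or
// burstable pod under the systemd cgroup driver:
// kubepods-<qos>-pod<uid'>.slice, with dashes in the UID replaced by
// underscores.
func podSliceName(qosClass, podUID string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, strings.ReplaceAll(podUID, "-", "_"))
}

func main() {
	// UID taken from the calico-apiserver pod entries above.
	fmt.Println(podSliceName("besteffort", "18a843ce-b18e-4959-8f6e-0f50bd293efe"))
	// Prints: kubepods-besteffort-pod18a843ce_b18e_4959_8f6e_0f50bd293efe.slice
}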
May 14 23:40:56.419222 kubelet[2831]: I0514 23:40:56.419097 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b16fe825-67e1-4ddb-b1d8-e5a537b879ea-calico-apiserver-certs\") pod \"calico-apiserver-549c9ddc47-qvrrc\" (UID: \"b16fe825-67e1-4ddb-b1d8-e5a537b879ea\") " pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:40:56.419586 kubelet[2831]: I0514 23:40:56.419335 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzkjk\" (UniqueName: \"kubernetes.io/projected/26bdfeca-0687-46a9-90a0-450c95fd195f-kube-api-access-nzkjk\") pod \"calico-kube-controllers-7cd45688fd-h8dqc\" (UID: \"26bdfeca-0687-46a9-90a0-450c95fd195f\") " pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:40:56.419586 kubelet[2831]: I0514 23:40:56.419364 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w7wj\" (UniqueName: \"kubernetes.io/projected/5e86d3e4-1ed5-4ded-9be6-007462b12e77-kube-api-access-5w7wj\") pod \"coredns-6f6b679f8f-w9hfg\" (UID: \"5e86d3e4-1ed5-4ded-9be6-007462b12e77\") " pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:40:56.419586 kubelet[2831]: I0514 23:40:56.419392 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/26bdfeca-0687-46a9-90a0-450c95fd195f-tigera-ca-bundle\") pod \"calico-kube-controllers-7cd45688fd-h8dqc\" (UID: \"26bdfeca-0687-46a9-90a0-450c95fd195f\") " pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:40:56.419586 kubelet[2831]: I0514 23:40:56.419411 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scvxm\" (UniqueName: \"kubernetes.io/projected/91fe5f31-0539-49b6-83f5-1dfaad928a0a-kube-api-access-scvxm\") pod \"coredns-6f6b679f8f-rnrcm\" (UID: \"91fe5f31-0539-49b6-83f5-1dfaad928a0a\") " pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:40:56.419586 kubelet[2831]: I0514 23:40:56.419426 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2swmn\" (UniqueName: \"kubernetes.io/projected/b16fe825-67e1-4ddb-b1d8-e5a537b879ea-kube-api-access-2swmn\") pod \"calico-apiserver-549c9ddc47-qvrrc\" (UID: \"b16fe825-67e1-4ddb-b1d8-e5a537b879ea\") " pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:40:56.419762 kubelet[2831]: I0514 23:40:56.419438 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2zqf\" (UniqueName: \"kubernetes.io/projected/18a843ce-b18e-4959-8f6e-0f50bd293efe-kube-api-access-f2zqf\") pod \"calico-apiserver-549c9ddc47-k62bq\" (UID: \"18a843ce-b18e-4959-8f6e-0f50bd293efe\") " pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:40:56.419762 kubelet[2831]: I0514 23:40:56.419453 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e86d3e4-1ed5-4ded-9be6-007462b12e77-config-volume\") pod \"coredns-6f6b679f8f-w9hfg\" (UID: \"5e86d3e4-1ed5-4ded-9be6-007462b12e77\") " pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:40:56.419762 kubelet[2831]: I0514 23:40:56.419483 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/18a843ce-b18e-4959-8f6e-0f50bd293efe-calico-apiserver-certs\") pod \"calico-apiserver-549c9ddc47-k62bq\" (UID: \"18a843ce-b18e-4959-8f6e-0f50bd293efe\") " pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:40:56.419762 kubelet[2831]: I0514 23:40:56.419497 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/91fe5f31-0539-49b6-83f5-1dfaad928a0a-config-volume\") pod \"coredns-6f6b679f8f-rnrcm\" (UID: \"91fe5f31-0539-49b6-83f5-1dfaad928a0a\") " pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:40:56.574457 containerd[1574]: time="2025-05-14T23:40:56.573644897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:40:56.582887 containerd[1574]: time="2025-05-14T23:40:56.582866907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,}" May 14 23:40:56.592765 containerd[1574]: time="2025-05-14T23:40:56.592736869Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,}" May 14 23:40:56.593483 containerd[1574]: time="2025-05-14T23:40:56.593095293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:40:56.593483 containerd[1574]: time="2025-05-14T23:40:56.593202364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:40:56.756224 containerd[1574]: time="2025-05-14T23:40:56.755791165Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 14 23:40:56.767925 containerd[1574]: time="2025-05-14T23:40:56.767897902Z" level=error msg="Failed to destroy network for sandbox \"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.784592 containerd[1574]: time="2025-05-14T23:40:56.770808835Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.784757 containerd[1574]: time="2025-05-14T23:40:56.775363467Z" level=error msg="Failed to destroy network for sandbox \"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.785812 containerd[1574]: time="2025-05-14T23:40:56.784071300Z" level=error msg="Failed 
to destroy network for sandbox \"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.787754 containerd[1574]: time="2025-05-14T23:40:56.787692416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.790636 containerd[1574]: time="2025-05-14T23:40:56.790439166Z" level=error msg="Failed to destroy network for sandbox \"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.790985 containerd[1574]: time="2025-05-14T23:40:56.790889359Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.790985 containerd[1574]: time="2025-05-14T23:40:56.786155936Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.790985 containerd[1574]: time="2025-05-14T23:40:56.786198935Z" level=error msg="Failed to destroy network for sandbox \"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.791399 containerd[1574]: time="2025-05-14T23:40:56.791361727Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.798625 kubelet[2831]: E0514 23:40:56.787391 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.798625 kubelet[2831]: E0514 23:40:56.798577 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.802539 kubelet[2831]: E0514 23:40:56.798593 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:40:56.802539 kubelet[2831]: E0514 23:40:56.802072 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:40:56.802539 kubelet[2831]: E0514 23:40:56.802105 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ab6f01510f22b221a6ab1366a9ae95d7dc4b4a181b214dea9475d6262a6e2cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" podUID="b16fe825-67e1-4ddb-b1d8-e5a537b879ea" May 14 23:40:56.802698 kubelet[2831]: E0514 23:40:56.791564 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.802698 kubelet[2831]: E0514 23:40:56.802149 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:40:56.802698 
kubelet[2831]: E0514 23:40:56.802162 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:40:56.802755 kubelet[2831]: E0514 23:40:56.802192 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9fe6f896f7988d03a6b4c80997e2d9717f072c1cfcf9e3d9d902edd87efbc7bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-w9hfg" podUID="5e86d3e4-1ed5-4ded-9be6-007462b12e77" May 14 23:40:56.802755 kubelet[2831]: E0514 23:40:56.802390 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.802755 kubelet[2831]: E0514 23:40:56.802405 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:40:56.802831 kubelet[2831]: E0514 23:40:56.802417 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:40:56.802831 kubelet[2831]: E0514 23:40:56.802433 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e9697e3f8f6fa83bf8550292fa72d6fff68ed9ba1c934d58a6ec5b6c4ace239\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" 
podUID="18a843ce-b18e-4959-8f6e-0f50bd293efe" May 14 23:40:56.802831 kubelet[2831]: E0514 23:40:56.802462 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:56.802900 kubelet[2831]: E0514 23:40:56.802472 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:40:56.802900 kubelet[2831]: E0514 23:40:56.802480 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:40:56.802900 kubelet[2831]: E0514 23:40:56.802491 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e73e03dfb8bc4c330e59d31eac75d8c8022057f35b4844b6571e89f07de2aa72\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rnrcm" podUID="91fe5f31-0539-49b6-83f5-1dfaad928a0a" May 14 23:40:56.803046 kubelet[2831]: E0514 23:40:56.798600 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:40:56.803046 kubelet[2831]: E0514 23:40:56.803002 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:40:56.803046 kubelet[2831]: E0514 23:40:56.803027 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8615e0151c0ad616a527d5519487d0738da85892a6c47ffc6a54ccb3f995738f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podUID="26bdfeca-0687-46a9-90a0-450c95fd195f" May 14 23:40:57.514488 systemd[1]: Created slice kubepods-besteffort-pod018bb5c2_b3b5_4cf8_a7ea_a530d2470442.slice - libcontainer container kubepods-besteffort-pod018bb5c2_b3b5_4cf8_a7ea_a530d2470442.slice. May 14 23:40:57.515869 containerd[1574]: time="2025-05-14T23:40:57.515852061Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,}" May 14 23:40:57.550217 containerd[1574]: time="2025-05-14T23:40:57.550173282Z" level=error msg="Failed to destroy network for sandbox \"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:57.551397 systemd[1]: run-netns-cni\x2dff8375e4\x2d479a\x2d064d\x2d4640\x2dde25ce7e7898.mount: Deactivated successfully. May 14 23:40:57.552007 containerd[1574]: time="2025-05-14T23:40:57.551988217Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:57.552120 kubelet[2831]: E0514 23:40:57.552101 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:40:57.552155 kubelet[2831]: E0514 23:40:57.552138 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:57.552155 kubelet[2831]: E0514 23:40:57.552151 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/csi-node-driver-9zp9w" May 14 23:40:57.552204 kubelet[2831]: E0514 23:40:57.552178 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d4e9cbd2de2127d1bb6235b97be108de147b6e0c3769c6247b3a68fc32516e18\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:41:00.882290 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1711816031.mount: Deactivated successfully. May 14 23:41:00.995498 containerd[1574]: time="2025-05-14T23:41:00.990374140Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:01.029341 containerd[1574]: time="2025-05-14T23:41:00.990803412Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 14 23:41:01.030210 containerd[1574]: time="2025-05-14T23:41:01.017996416Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:01.030210 containerd[1574]: time="2025-05-14T23:41:01.018597522Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 4.262190674s" May 14 23:41:01.030210 containerd[1574]: time="2025-05-14T23:41:01.029902233Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:01.032535 containerd[1574]: time="2025-05-14T23:41:01.032503402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 14 23:41:01.063073 containerd[1574]: time="2025-05-14T23:41:01.063043316Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 23:41:01.092256 containerd[1574]: time="2025-05-14T23:41:01.092153860Z" level=info msg="Container 7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:01.094582 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4167453772.mount: Deactivated successfully. 
May 14 23:41:01.289351 containerd[1574]: time="2025-05-14T23:41:01.289271254Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\"" May 14 23:41:01.296194 containerd[1574]: time="2025-05-14T23:41:01.296031095Z" level=info msg="StartContainer for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\"" May 14 23:41:01.302757 containerd[1574]: time="2025-05-14T23:41:01.302674623Z" level=info msg="connecting to shim 7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:41:01.383441 systemd[1]: Started cri-containerd-7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a.scope - libcontainer container 7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a. May 14 23:41:01.418469 containerd[1574]: time="2025-05-14T23:41:01.418400480Z" level=info msg="StartContainer for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" returns successfully" May 14 23:41:01.502709 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 14 23:41:01.504189 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 14 23:41:01.521542 systemd[1]: cri-containerd-7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a.scope: Deactivated successfully. May 14 23:41:01.524628 containerd[1574]: time="2025-05-14T23:41:01.524490578Z" level=info msg="received exit event container_id:\"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" id:\"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" pid:3811 exit_status:1 exited_at:{seconds:1747266061 nanos:524132344}" May 14 23:41:01.524628 containerd[1574]: time="2025-05-14T23:41:01.524571054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" id:\"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" pid:3811 exit_status:1 exited_at:{seconds:1747266061 nanos:524132344}" May 14 23:41:02.017361 kubelet[2831]: I0514 23:41:02.000016 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-79jwn" podStartSLOduration=3.442660639 podStartE2EDuration="18.85663902s" podCreationTimestamp="2025-05-14 23:40:43 +0000 UTC" firstStartedPulling="2025-05-14 23:40:45.62638025 +0000 UTC m=+12.259181908" lastFinishedPulling="2025-05-14 23:41:01.040358632 +0000 UTC m=+27.673160289" observedRunningTime="2025-05-14 23:41:01.849422788 +0000 UTC m=+28.482224457" watchObservedRunningTime="2025-05-14 23:41:01.85663902 +0000 UTC m=+28.489440681" May 14 23:41:02.239721 containerd[1574]: time="2025-05-14T23:41:02.239634117Z" level=error msg="ExecSync for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 23:41:02.239969 kubelet[2831]: E0514 23:41:02.239821 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" 
cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:02.240117 containerd[1574]: time="2025-05-14T23:41:02.240088234Z" level=error msg="ExecSync for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 23:41:02.240201 kubelet[2831]: E0514 23:41:02.240176 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:02.246256 containerd[1574]: time="2025-05-14T23:41:02.246206097Z" level=error msg="ExecSync for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 23:41:02.246716 kubelet[2831]: E0514 23:41:02.246467 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:02.843928 kubelet[2831]: I0514 23:41:02.843889 2831 scope.go:117] "RemoveContainer" containerID="7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" May 14 23:41:02.847012 containerd[1574]: time="2025-05-14T23:41:02.846468023Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:calico-node,Attempt:1,}" May 14 23:41:02.853501 containerd[1574]: time="2025-05-14T23:41:02.853479979Z" level=info msg="Container 2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:02.855787 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3311738410.mount: Deactivated successfully. May 14 23:41:02.862869 containerd[1574]: time="2025-05-14T23:41:02.862842167Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:calico-node,Attempt:1,} returns container id \"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\"" May 14 23:41:02.863591 containerd[1574]: time="2025-05-14T23:41:02.863572901Z" level=info msg="StartContainer for \"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\"" May 14 23:41:02.865083 containerd[1574]: time="2025-05-14T23:41:02.865061829Z" level=info msg="connecting to shim 2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:41:02.880416 systemd[1]: Started cri-containerd-2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64.scope - libcontainer container 2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64. May 14 23:41:02.909083 containerd[1574]: time="2025-05-14T23:41:02.908926209Z" level=info msg="StartContainer for \"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" returns successfully" May 14 23:41:03.025827 systemd[1]: cri-containerd-2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64.scope: Deactivated successfully. 
May 14 23:41:03.026500 containerd[1574]: time="2025-05-14T23:41:03.025885594Z" level=info msg="received exit event container_id:\"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" id:\"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" pid:3866 exit_status:1 exited_at:{seconds:1747266063 nanos:25692390}" May 14 23:41:03.026500 containerd[1574]: time="2025-05-14T23:41:03.026411779Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" id:\"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" pid:3866 exit_status:1 exited_at:{seconds:1747266063 nanos:25692390}" May 14 23:41:03.026008 systemd[1]: cri-containerd-2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64.scope: Consumed 47ms CPU time, 37.2M memory peak, 12.8M read from disk. May 14 23:41:03.042529 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64-rootfs.mount: Deactivated successfully. May 14 23:41:03.845396 kubelet[2831]: I0514 23:41:03.844999 2831 scope.go:117] "RemoveContainer" containerID="7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a" May 14 23:41:03.845396 kubelet[2831]: I0514 23:41:03.845233 2831 scope.go:117] "RemoveContainer" containerID="2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" May 14 23:41:03.847978 containerd[1574]: time="2025-05-14T23:41:03.847503369Z" level=info msg="RemoveContainer for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\"" May 14 23:41:03.854039 containerd[1574]: time="2025-05-14T23:41:03.854012205Z" level=info msg="RemoveContainer for \"7fcda7fd4517770b1caa36b4466a204cbf02d8b9e2a1afd764350f22b7dfd48a\" returns successfully" May 14 23:41:03.860908 kubelet[2831]: E0514 23:41:03.860207 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 10s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:04.158583 kubelet[2831]: I0514 23:41:04.158451 2831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 23:41:04.850300 kubelet[2831]: I0514 23:41:04.850220 2831 scope.go:117] "RemoveContainer" containerID="2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" May 14 23:41:04.850300 kubelet[2831]: E0514 23:41:04.850284 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 10s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:07.511492 containerd[1574]: time="2025-05-14T23:41:07.511110286Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,}" May 14 23:41:07.511492 containerd[1574]: time="2025-05-14T23:41:07.511297102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:41:07.665615 containerd[1574]: time="2025-05-14T23:41:07.665583587Z" level=error msg="Failed to 
destroy network for sandbox \"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.667012 systemd[1]: run-netns-cni\x2d2d0ac2ac\x2da3ea\x2d69b5\x2dc378\x2dd63c4192bd8f.mount: Deactivated successfully. May 14 23:41:07.670887 containerd[1574]: time="2025-05-14T23:41:07.670848809Z" level=error msg="Failed to destroy network for sandbox \"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.671928 systemd[1]: run-netns-cni\x2d2cd2023d\x2df25a\x2d6463\x2d6dc2\x2d1ae443c4f077.mount: Deactivated successfully. May 14 23:41:07.703769 containerd[1574]: time="2025-05-14T23:41:07.703622113Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.703972 kubelet[2831]: E0514 23:41:07.703951 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.704251 kubelet[2831]: E0514 23:41:07.704171 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:07.704251 kubelet[2831]: E0514 23:41:07.704189 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:07.704251 kubelet[2831]: E0514 23:41:07.704226 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"38fe22a97c660301ae5d87634c1fc0b3df28909c6f27ed606c24553bd248053e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rnrcm" podUID="91fe5f31-0539-49b6-83f5-1dfaad928a0a" May 14 23:41:07.711323 containerd[1574]: time="2025-05-14T23:41:07.711248625Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.711402 kubelet[2831]: E0514 23:41:07.711352 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:07.711402 kubelet[2831]: E0514 23:41:07.711391 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:07.711463 kubelet[2831]: E0514 23:41:07.711404 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:07.711463 kubelet[2831]: E0514 23:41:07.711428 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6fd568156c206cbc40269532d648bc4cae34225f1c8173e3058b2e8e96f2d35a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podUID="26bdfeca-0687-46a9-90a0-450c95fd195f" May 14 23:41:08.511591 containerd[1574]: time="2025-05-14T23:41:08.511510664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,}" May 14 23:41:08.549789 containerd[1574]: time="2025-05-14T23:41:08.549750770Z" level=error msg="Failed to destroy network for sandbox \"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\"" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:08.551323 systemd[1]: run-netns-cni\x2d047b49c1\x2d526c\x2d1972\x2d6583\x2d6f69ad0992a4.mount: Deactivated successfully. May 14 23:41:08.555693 containerd[1574]: time="2025-05-14T23:41:08.555657075Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:08.556155 kubelet[2831]: E0514 23:41:08.555832 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:08.556155 kubelet[2831]: E0514 23:41:08.555881 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:08.556155 kubelet[2831]: E0514 23:41:08.555894 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:08.556298 kubelet[2831]: E0514 23:41:08.555924 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"67450fcaf8af523fc170433c13da22bd2de7a232c0a557da2f1e5f9bc78a4c60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:41:09.511200 containerd[1574]: time="2025-05-14T23:41:09.511166788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:09.556886 containerd[1574]: time="2025-05-14T23:41:09.556851384Z" level=error msg="Failed to destroy network for sandbox 
\"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:09.558615 systemd[1]: run-netns-cni\x2d8afc000a\x2da1f5\x2dff93\x2d0a01\x2d9413ab083cfa.mount: Deactivated successfully. May 14 23:41:09.559119 containerd[1574]: time="2025-05-14T23:41:09.559088280Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:09.559490 kubelet[2831]: E0514 23:41:09.559360 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:09.559490 kubelet[2831]: E0514 23:41:09.559409 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:09.559490 kubelet[2831]: E0514 23:41:09.559425 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:09.559738 kubelet[2831]: E0514 23:41:09.559463 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b4f2f73093de8c2d79669666c26c2d6962f08fede586663ba792b3eb6e0f4ac4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" podUID="b16fe825-67e1-4ddb-b1d8-e5a537b879ea" May 14 23:41:10.064014 kubelet[2831]: I0514 23:41:10.063987 2831 scope.go:117] "RemoveContainer" containerID="2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" May 14 23:41:10.074168 kubelet[2831]: E0514 23:41:10.064103 2831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 10s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:10.511526 containerd[1574]: time="2025-05-14T23:41:10.511476458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:10.511686 containerd[1574]: time="2025-05-14T23:41:10.511476472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:41:10.623804 containerd[1574]: time="2025-05-14T23:41:10.623773961Z" level=error msg="Failed to destroy network for sandbox \"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.624476 containerd[1574]: time="2025-05-14T23:41:10.623991478Z" level=error msg="Failed to destroy network for sandbox \"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.624476 containerd[1574]: time="2025-05-14T23:41:10.624376342Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.624578 kubelet[2831]: E0514 23:41:10.624536 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.624742 kubelet[2831]: E0514 23:41:10.624584 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:10.624742 kubelet[2831]: E0514 23:41:10.624598 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:10.624742 kubelet[2831]: E0514 23:41:10.624638 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"303bc749180692669f69dfff32d06c251650d0d53b059209c3cbf51cf816b723\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" podUID="18a843ce-b18e-4959-8f6e-0f50bd293efe" May 14 23:41:10.626510 containerd[1574]: time="2025-05-14T23:41:10.626128231Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.626727 kubelet[2831]: E0514 23:41:10.626276 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:10.626727 kubelet[2831]: E0514 23:41:10.626390 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:10.626727 kubelet[2831]: E0514 23:41:10.626407 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:10.626845 kubelet[2831]: E0514 23:41:10.626457 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"feb299ea18517a796683a2ad17a05d6e0298f984ad759f13c20a4c2b264939fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-w9hfg" podUID="5e86d3e4-1ed5-4ded-9be6-007462b12e77" May 14 23:41:10.628040 systemd[1]: run-netns-cni\x2dec892f05\x2d77bb\x2dc3f3\x2da141\x2d8ab01b34da44.mount: Deactivated successfully. May 14 23:41:10.628136 systemd[1]: run-netns-cni\x2d42e761fd\x2db910\x2de0d0\x2dfc37\x2d0ef3abf09475.mount: Deactivated successfully. May 14 23:41:15.636743 kubelet[2831]: I0514 23:41:15.636720 2831 scope.go:117] "RemoveContainer" containerID="2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" May 14 23:41:15.639338 containerd[1574]: time="2025-05-14T23:41:15.639060892Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:calico-node,Attempt:2,}" May 14 23:41:15.644515 containerd[1574]: time="2025-05-14T23:41:15.644480356Z" level=info msg="Container 0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:15.651811 containerd[1574]: time="2025-05-14T23:41:15.651779623Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:calico-node,Attempt:2,} returns container id \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\"" May 14 23:41:15.652303 containerd[1574]: time="2025-05-14T23:41:15.652284252Z" level=info msg="StartContainer for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\"" May 14 23:41:15.653345 containerd[1574]: time="2025-05-14T23:41:15.653284808Z" level=info msg="connecting to shim 0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:41:15.677436 systemd[1]: Started cri-containerd-0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5.scope - libcontainer container 0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5. May 14 23:41:15.715315 containerd[1574]: time="2025-05-14T23:41:15.715112647Z" level=info msg="StartContainer for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" returns successfully" May 14 23:41:15.814755 systemd[1]: cri-containerd-0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5.scope: Deactivated successfully. May 14 23:41:15.814934 systemd[1]: cri-containerd-0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5.scope: Consumed 55ms CPU time, 25.4M memory peak, 5.4M read from disk. May 14 23:41:15.815875 containerd[1574]: time="2025-05-14T23:41:15.815800626Z" level=info msg="received exit event container_id:\"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" id:\"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" pid:4098 exit_status:1 exited_at:{seconds:1747266075 nanos:815619585}" May 14 23:41:15.816075 containerd[1574]: time="2025-05-14T23:41:15.816057855Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" id:\"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" pid:4098 exit_status:1 exited_at:{seconds:1747266075 nanos:815619585}" May 14 23:41:15.888991 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5-rootfs.mount: Deactivated successfully. 
May 14 23:41:15.900926 containerd[1574]: time="2025-05-14T23:41:15.900871531Z" level=error msg="ExecSync for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" failed" error="rpc error: code = NotFound desc = failed to exec in container: failed to create exec \"963792dc3332706ed8f10052d7e9490aebca499061d439c31bcf49cb27b01dfd\": task 0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5 not found" May 14 23:41:15.901036 kubelet[2831]: E0514 23:41:15.900999 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = failed to exec in container: failed to create exec \"963792dc3332706ed8f10052d7e9490aebca499061d439c31bcf49cb27b01dfd\": task 0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5 not found" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:15.901188 containerd[1574]: time="2025-05-14T23:41:15.901165640Z" level=error msg="ExecSync for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 23:41:15.901238 kubelet[2831]: E0514 23:41:15.901223 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:15.901439 containerd[1574]: time="2025-05-14T23:41:15.901387342Z" level=error msg="ExecSync for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" failed" error="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" May 14 23:41:15.901480 kubelet[2831]: E0514 23:41:15.901465 2831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = failed to exec in container: container is in CONTAINER_EXITED state" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" cmd=["/bin/calico-node","-bird-ready","-felix-ready"] May 14 23:41:16.883103 kubelet[2831]: I0514 23:41:16.883083 2831 scope.go:117] "RemoveContainer" containerID="2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64" May 14 23:41:16.885046 containerd[1574]: time="2025-05-14T23:41:16.884950931Z" level=info msg="RemoveContainer for \"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\"" May 14 23:41:16.885880 kubelet[2831]: I0514 23:41:16.885546 2831 scope.go:117] "RemoveContainer" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" May 14 23:41:16.885880 kubelet[2831]: E0514 23:41:16.885638 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 20s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:16.890319 containerd[1574]: time="2025-05-14T23:41:16.889298585Z" level=info msg="RemoveContainer for \"2b62d3936a339426c2fe8fdfd694bd8315b3a5fa51d58ba7dec27385c576bd64\" returns successfully" May 14 23:41:17.886753 kubelet[2831]: I0514 23:41:17.886734 2831 scope.go:117] "RemoveContainer" 
containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" May 14 23:41:17.886992 kubelet[2831]: E0514 23:41:17.886816 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 20s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:20.511599 containerd[1574]: time="2025-05-14T23:41:20.511527502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:41:20.550762 containerd[1574]: time="2025-05-14T23:41:20.550705515Z" level=error msg="Failed to destroy network for sandbox \"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:20.552503 containerd[1574]: time="2025-05-14T23:41:20.552430182Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:20.552724 kubelet[2831]: E0514 23:41:20.552682 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:20.552907 kubelet[2831]: E0514 23:41:20.552744 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:20.552907 kubelet[2831]: E0514 23:41:20.552764 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:20.552907 kubelet[2831]: E0514 23:41:20.552804 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"36ffbaad149a193bb8d028a78a9832c0c40e61151a62877acbe89ab714dfd81c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podUID="26bdfeca-0687-46a9-90a0-450c95fd195f" May 14 23:41:20.553069 systemd[1]: run-netns-cni\x2d9a581051\x2da766\x2db7b7\x2d6079\x2dc530b18d1c52.mount: Deactivated successfully. May 14 23:41:21.511810 containerd[1574]: time="2025-05-14T23:41:21.511750816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:41:21.545893 containerd[1574]: time="2025-05-14T23:41:21.545851625Z" level=error msg="Failed to destroy network for sandbox \"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:21.546338 containerd[1574]: time="2025-05-14T23:41:21.546296199Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:21.547728 kubelet[2831]: E0514 23:41:21.546487 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:21.547728 kubelet[2831]: E0514 23:41:21.546529 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:21.547728 kubelet[2831]: E0514 23:41:21.546542 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:21.547605 systemd[1]: run-netns-cni\x2ddd3bd7df\x2d153b\x2d3d8a\x2defee\x2d832381acf858.mount: Deactivated successfully. 
May 14 23:41:21.547954 kubelet[2831]: E0514 23:41:21.546569 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4e962bde66222a2c286daaae254551b28f41ae16ac715fd6149e0ad23e273e1a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-w9hfg" podUID="5e86d3e4-1ed5-4ded-9be6-007462b12e77" May 14 23:41:22.511648 containerd[1574]: time="2025-05-14T23:41:22.511551454Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:22.511921 containerd[1574]: time="2025-05-14T23:41:22.511901498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,}" May 14 23:41:22.553734 containerd[1574]: time="2025-05-14T23:41:22.553698679Z" level=error msg="Failed to destroy network for sandbox \"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.554830 containerd[1574]: time="2025-05-14T23:41:22.554813232Z" level=error msg="Failed to destroy network for sandbox \"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.555233 containerd[1574]: time="2025-05-14T23:41:22.554973806Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.555288 kubelet[2831]: E0514 23:41:22.555093 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.555288 kubelet[2831]: E0514 23:41:22.555129 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:22.555288 kubelet[2831]: E0514 23:41:22.555144 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:22.556152 kubelet[2831]: E0514 23:41:22.555167 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b9a15734948d79edb9248b9420a795b81357539a4772e12c33c7ff9e7179eca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" podUID="18a843ce-b18e-4959-8f6e-0f50bd293efe" May 14 23:41:22.556662 kubelet[2831]: E0514 23:41:22.556527 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.556662 kubelet[2831]: E0514 23:41:22.556547 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:22.556662 kubelet[2831]: E0514 23:41:22.556558 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:22.556751 containerd[1574]: time="2025-05-14T23:41:22.556414081Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:22.556788 kubelet[2831]: E0514 23:41:22.556586 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" 
for \"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0fea9bb1d1ef4e00cb5c7f8e8ad1613ce1175dd5d1f8db219c84cfc29f4009b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rnrcm" podUID="91fe5f31-0539-49b6-83f5-1dfaad928a0a" May 14 23:41:22.556921 systemd[1]: run-netns-cni\x2dadca8ea3\x2d8c72\x2d0556\x2daaf6\x2d9d5e654f439f.mount: Deactivated successfully. May 14 23:41:22.559090 systemd[1]: run-netns-cni\x2d57196e88\x2d29f3\x2da6a5\x2d0527\x2d6f42b0c39155.mount: Deactivated successfully. May 14 23:41:23.512348 containerd[1574]: time="2025-05-14T23:41:23.512127274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:23.512348 containerd[1574]: time="2025-05-14T23:41:23.512239218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,}" May 14 23:41:23.599492 containerd[1574]: time="2025-05-14T23:41:23.599385145Z" level=error msg="Failed to destroy network for sandbox \"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.601249 systemd[1]: run-netns-cni\x2d1436ec21\x2d06e0\x2d2837\x2d5f07\x2d7894101b87f8.mount: Deactivated successfully. May 14 23:41:23.602445 containerd[1574]: time="2025-05-14T23:41:23.602372854Z" level=error msg="Failed to destroy network for sandbox \"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.603515 systemd[1]: run-netns-cni\x2d351468ab\x2dde23\x2dd579\x2d0708\x2d5a58e01a3c70.mount: Deactivated successfully. 
May 14 23:41:23.606614 containerd[1574]: time="2025-05-14T23:41:23.606592497Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.607140 kubelet[2831]: E0514 23:41:23.606845 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.607140 kubelet[2831]: E0514 23:41:23.606877 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:23.607140 kubelet[2831]: E0514 23:41:23.606891 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:23.607346 kubelet[2831]: E0514 23:41:23.606922 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1cf1219190b5ab1fb729c768af17edc44e395c6b673fef304dd093d5c3f84be5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:41:23.612891 containerd[1574]: time="2025-05-14T23:41:23.612866153Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.613046 kubelet[2831]: E0514 23:41:23.613026 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:23.613076 kubelet[2831]: E0514 23:41:23.613057 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:23.613076 kubelet[2831]: E0514 23:41:23.613069 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:23.613121 kubelet[2831]: E0514 23:41:23.613092 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e52556b1f640528697f06d72571f3bdf06ea28d5daba84906d014de17c2474f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" podUID="b16fe825-67e1-4ddb-b1d8-e5a537b879ea" May 14 23:41:28.510945 kubelet[2831]: I0514 23:41:28.510791 2831 scope.go:117] "RemoveContainer" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" May 14 23:41:28.511477 kubelet[2831]: E0514 23:41:28.511451 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 20s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:33.512331 containerd[1574]: time="2025-05-14T23:41:33.512288175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:33.554075 containerd[1574]: time="2025-05-14T23:41:33.552350716Z" level=error msg="Failed to destroy network for sandbox \"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:33.553582 systemd[1]: run-netns-cni\x2de13541ff\x2dec18\x2d2ada\x2d1185\x2d900aa5ba70d3.mount: Deactivated successfully. 
May 14 23:41:33.554737 containerd[1574]: time="2025-05-14T23:41:33.554711215Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:33.557267 kubelet[2831]: E0514 23:41:33.556234 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:33.557267 kubelet[2831]: E0514 23:41:33.556273 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:33.557267 kubelet[2831]: E0514 23:41:33.556286 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:33.557559 kubelet[2831]: E0514 23:41:33.556324 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"59be749ea1ae092592cc039f9dd3eecdd17000ce60746d27a9cb3f70f328a7db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" podUID="18a843ce-b18e-4959-8f6e-0f50bd293efe" May 14 23:41:34.510658 containerd[1574]: time="2025-05-14T23:41:34.510620475Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:41:34.557564 containerd[1574]: time="2025-05-14T23:41:34.557532625Z" level=error msg="Failed to destroy network for sandbox \"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:34.558889 systemd[1]: 
run-netns-cni\x2d3d155d06\x2dc12d\x2ddd2a\x2df16c\x2d5fe740dad864.mount: Deactivated successfully. May 14 23:41:34.562717 containerd[1574]: time="2025-05-14T23:41:34.562632722Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:34.562879 kubelet[2831]: E0514 23:41:34.562851 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:34.563049 kubelet[2831]: E0514 23:41:34.562906 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:34.563049 kubelet[2831]: E0514 23:41:34.562926 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:34.563049 kubelet[2831]: E0514 23:41:34.562953 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"26fb08111966ceeeb2741770369bfcd106d2846c54176f81007a379380aa39b4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-w9hfg" podUID="5e86d3e4-1ed5-4ded-9be6-007462b12e77" May 14 23:41:35.512229 containerd[1574]: time="2025-05-14T23:41:35.512178876Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:41:35.512535 containerd[1574]: time="2025-05-14T23:41:35.512178905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,}" May 14 23:41:35.552178 containerd[1574]: time="2025-05-14T23:41:35.551742644Z" level=error msg="Failed to destroy network for sandbox 
\"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:35.553026 systemd[1]: run-netns-cni\x2dbb09c1bf\x2db35b\x2de47b\x2dccae\x2dc2d72b058e29.mount: Deactivated successfully. May 14 23:41:35.554585 containerd[1574]: time="2025-05-14T23:41:35.554567047Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:35.554907 kubelet[2831]: E0514 23:41:35.554802 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:35.554907 kubelet[2831]: E0514 23:41:35.554848 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:35.554907 kubelet[2831]: E0514 23:41:35.554865 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9zp9w" May 14 23:41:35.554992 kubelet[2831]: E0514 23:41:35.554891 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9zp9w_calico-system(018bb5c2-b3b5-4cf8-a7ea-a530d2470442)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a55015c14fb93f501bed006e9c5208be5c7c24007173e70b562e5b520874792\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9zp9w" podUID="018bb5c2-b3b5-4cf8-a7ea-a530d2470442" May 14 23:41:35.558757 containerd[1574]: time="2025-05-14T23:41:35.558731676Z" level=error msg="Failed to destroy network for sandbox \"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 14 23:41:35.560178 systemd[1]: run-netns-cni\x2ded14aeef\x2d7a26\x2d88d0\x2d9173\x2dddf8fe05148d.mount: Deactivated successfully. May 14 23:41:35.560479 containerd[1574]: time="2025-05-14T23:41:35.560350416Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:35.560530 kubelet[2831]: E0514 23:41:35.560475 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:35.560530 kubelet[2831]: E0514 23:41:35.560510 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:35.560530 kubelet[2831]: E0514 23:41:35.560521 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:35.560666 kubelet[2831]: E0514 23:41:35.560546 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"78511b93feef3d7d958cd29ed164044f126d13c960d808ce68cb7f108f6cf4b2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podUID="26bdfeca-0687-46a9-90a0-450c95fd195f" May 14 23:41:36.510928 containerd[1574]: time="2025-05-14T23:41:36.510800044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,}" May 14 23:41:36.510928 containerd[1574]: time="2025-05-14T23:41:36.510830021Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:36.553344 containerd[1574]: time="2025-05-14T23:41:36.552354346Z" level=error msg="Failed to destroy network for sandbox \"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.553344 containerd[1574]: time="2025-05-14T23:41:36.552794468Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.556568 kubelet[2831]: E0514 23:41:36.552927 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.556568 kubelet[2831]: E0514 23:41:36.552965 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:36.556568 kubelet[2831]: E0514 23:41:36.552978 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" May 14 23:41:36.555442 systemd[1]: run-netns-cni\x2d622f1efa\x2dc36e\x2d26f4\x2d7df5\x2d6325d38afaf0.mount: Deactivated successfully. 
May 14 23:41:36.557024 containerd[1574]: time="2025-05-14T23:41:36.553359504Z" level=error msg="Failed to destroy network for sandbox \"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.557024 containerd[1574]: time="2025-05-14T23:41:36.553873637Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.557086 kubelet[2831]: E0514 23:41:36.553004 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-qvrrc_calico-apiserver(b16fe825-67e1-4ddb-b1d8-e5a537b879ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d0845e9ffb71bfc958b6262111cd774b90a4d67b0fc01b08980f08cd10ecd278\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" podUID="b16fe825-67e1-4ddb-b1d8-e5a537b879ea" May 14 23:41:36.557086 kubelet[2831]: E0514 23:41:36.554037 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:36.557086 kubelet[2831]: E0514 23:41:36.554075 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:36.555510 systemd[1]: run-netns-cni\x2de26326f6\x2d8e1d\x2dc45c\x2d9b47\x2d5fff06003ad9.mount: Deactivated successfully. 
May 14 23:41:36.557193 kubelet[2831]: E0514 23:41:36.554176 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-rnrcm" May 14 23:41:36.557193 kubelet[2831]: E0514 23:41:36.554208 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-rnrcm_kube-system(91fe5f31-0539-49b6-83f5-1dfaad928a0a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a880d98846e99415b2f0034cc34c192439ff1d9acc574a174e30e4275c13af3b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-rnrcm" podUID="91fe5f31-0539-49b6-83f5-1dfaad928a0a" May 14 23:41:39.511101 kubelet[2831]: I0514 23:41:39.510490 2831 scope.go:117] "RemoveContainer" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" May 14 23:41:39.511781 containerd[1574]: time="2025-05-14T23:41:39.511765006Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for container &ContainerMetadata{Name:calico-node,Attempt:3,}" May 14 23:41:39.517421 containerd[1574]: time="2025-05-14T23:41:39.517262035Z" level=info msg="Container 439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:39.521187 containerd[1574]: time="2025-05-14T23:41:39.521166600Z" level=info msg="CreateContainer within sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" for &ContainerMetadata{Name:calico-node,Attempt:3,} returns container id \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\"" May 14 23:41:39.521897 containerd[1574]: time="2025-05-14T23:41:39.521585069Z" level=info msg="StartContainer for \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\"" May 14 23:41:39.522380 containerd[1574]: time="2025-05-14T23:41:39.522364413Z" level=info msg="connecting to shim 439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e" address="unix:///run/containerd/s/756fa8dc4bb9a8bceeb69d3e8926a084661804761cc5f35bb32829cabc0f16d5" protocol=ttrpc version=3 May 14 23:41:39.541550 systemd[1]: Started cri-containerd-439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e.scope - libcontainer container 439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e. May 14 23:41:39.566451 containerd[1574]: time="2025-05-14T23:41:39.566422577Z" level=info msg="StartContainer for \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" returns successfully" May 14 23:41:39.621278 systemd[1]: cri-containerd-439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e.scope: Deactivated successfully. May 14 23:41:39.621481 systemd[1]: cri-containerd-439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e.scope: Consumed 45ms CPU time, 25M memory peak, 2.8M read from disk. 
May 14 23:41:39.622688 containerd[1574]: time="2025-05-14T23:41:39.622667276Z" level=info msg="received exit event container_id:\"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" id:\"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" pid:4517 exit_status:1 exited_at:{seconds:1747266099 nanos:621614304}" May 14 23:41:39.622885 containerd[1574]: time="2025-05-14T23:41:39.622867531Z" level=info msg="TaskExit event in podsandbox handler container_id:\"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" id:\"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" pid:4517 exit_status:1 exited_at:{seconds:1747266099 nanos:621614304}" May 14 23:41:39.641767 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e-rootfs.mount: Deactivated successfully. May 14 23:41:39.933267 kubelet[2831]: I0514 23:41:39.932611 2831 scope.go:117] "RemoveContainer" containerID="0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5" May 14 23:41:39.933267 kubelet[2831]: I0514 23:41:39.932868 2831 scope.go:117] "RemoveContainer" containerID="439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e" May 14 23:41:39.933267 kubelet[2831]: E0514 23:41:39.932947 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-node\" with CrashLoopBackOff: \"back-off 40s restarting failed container=calico-node pod=calico-node-79jwn_calico-system(025e0036-3e02-4104-8cdd-fedbf0774454)\"" pod="calico-system/calico-node-79jwn" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" May 14 23:41:39.935657 containerd[1574]: time="2025-05-14T23:41:39.935632814Z" level=info msg="RemoveContainer for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\"" May 14 23:41:39.939220 containerd[1574]: time="2025-05-14T23:41:39.939197403Z" level=info msg="RemoveContainer for \"0e1cc364f00cbb4f099db3251a307fea49d4e78f66efa98781803464116056d5\" returns successfully" May 14 23:41:43.558828 systemd[1]: Started sshd@7-139.178.70.107:22-147.75.109.163:49092.service - OpenSSH per-connection server daemon (147.75.109.163:49092). May 14 23:41:43.628040 sshd[4550]: Accepted publickey for core from 147.75.109.163 port 49092 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:41:43.630253 sshd-session[4550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:41:43.639260 systemd-logind[1552]: New session 10 of user core. May 14 23:41:43.642463 systemd[1]: Started session-10.scope - Session 10 of User core. May 14 23:41:43.917998 sshd[4552]: Connection closed by 147.75.109.163 port 49092 May 14 23:41:43.918442 sshd-session[4550]: pam_unix(sshd:session): session closed for user core May 14 23:41:43.921208 systemd-logind[1552]: Session 10 logged out. Waiting for processes to exit. May 14 23:41:43.921570 systemd[1]: sshd@7-139.178.70.107:22-147.75.109.163:49092.service: Deactivated successfully. May 14 23:41:43.922924 systemd[1]: session-10.scope: Deactivated successfully. May 14 23:41:43.923689 systemd-logind[1552]: Removed session 10. 
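The repeated calico-node failures above have driven the container into CrashLoopBackOff, and the reported "back-off 40s" is consistent with kubelet's documented restart backoff: a 10-second base that doubles with each restart and is capped at five minutes. The following is a small, hypothetical Go sketch of that arithmetic only; it is not kubelet code.

    package main

    import (
        "fmt"
        "time"
    )

    // crashLoopDelay returns the expected back-off before restart attempt n
    // (n = 1 for the first restart), assuming a 10s base, factor 2, 5m cap.
    func crashLoopDelay(n int) time.Duration {
        d := 10 * time.Second
        for i := 1; i < n; i++ {
            d *= 2
            if d > 5*time.Minute {
                return 5 * time.Minute
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 6; n++ {
            fmt.Printf("restart %d: back-off %s\n", n, crashLoopDelay(n))
        }
        // restart 3 prints "back-off 40s", matching the calico-node entry above.
    }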
May 14 23:41:44.401632 containerd[1574]: time="2025-05-14T23:41:44.400870860Z" level=info msg="StopPodSandbox for \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\"" May 14 23:41:44.402220 containerd[1574]: time="2025-05-14T23:41:44.401938211Z" level=info msg="Container to stop \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 23:41:44.402220 containerd[1574]: time="2025-05-14T23:41:44.401955352Z" level=info msg="Container to stop \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 23:41:44.402220 containerd[1574]: time="2025-05-14T23:41:44.401961555Z" level=info msg="Container to stop \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" May 14 23:41:44.412947 systemd[1]: cri-containerd-6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e.scope: Deactivated successfully. May 14 23:41:44.413536 containerd[1574]: time="2025-05-14T23:41:44.413361969Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" id:\"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" pid:3406 exit_status:137 exited_at:{seconds:1747266104 nanos:412765467}" May 14 23:41:44.442095 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e-rootfs.mount: Deactivated successfully. May 14 23:41:44.451701 containerd[1574]: time="2025-05-14T23:41:44.451562310Z" level=info msg="shim disconnected" id=6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e namespace=k8s.io May 14 23:41:44.451701 containerd[1574]: time="2025-05-14T23:41:44.451583714Z" level=warning msg="cleaning up after shim disconnected" id=6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e namespace=k8s.io May 14 23:41:44.456527 containerd[1574]: time="2025-05-14T23:41:44.451589169Z" level=info msg="cleaning up dead shim" namespace=k8s.io May 14 23:41:44.504852 containerd[1574]: time="2025-05-14T23:41:44.502706485Z" level=info msg="TearDown network for sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" successfully" May 14 23:41:44.504852 containerd[1574]: time="2025-05-14T23:41:44.502726803Z" level=info msg="StopPodSandbox for \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" returns successfully" May 14 23:41:44.504852 containerd[1574]: time="2025-05-14T23:41:44.504247956Z" level=info msg="received exit event sandbox_id:\"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" exit_status:137 exited_at:{seconds:1747266104 nanos:412765467}" May 14 23:41:44.504285 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e-shm.mount: Deactivated successfully. 
May 14 23:41:44.537554 kubelet[2831]: E0514 23:41:44.537363 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.537554 kubelet[2831]: E0514 23:41:44.537396 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.537554 kubelet[2831]: E0514 23:41:44.537402 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.537554 kubelet[2831]: E0514 23:41:44.537407 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="flexvol-driver" May 14 23:41:44.537554 kubelet[2831]: E0514 23:41:44.537413 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="install-cni" May 14 23:41:44.540902 kubelet[2831]: I0514 23:41:44.540875 2831 memory_manager.go:354] "RemoveStaleState removing state" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.541002 kubelet[2831]: I0514 23:41:44.540920 2831 memory_manager.go:354] "RemoveStaleState removing state" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.541002 kubelet[2831]: I0514 23:41:44.540929 2831 memory_manager.go:354] "RemoveStaleState removing state" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.541002 kubelet[2831]: E0514 23:41:44.540973 2831 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.541145 kubelet[2831]: I0514 23:41:44.540992 2831 memory_manager.go:354] "RemoveStaleState removing state" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" containerName="calico-node" May 14 23:41:44.550694 systemd[1]: Created slice kubepods-besteffort-poddf9f0f41_c17f_41de_914d_86da5486109f.slice - libcontainer container kubepods-besteffort-poddf9f0f41_c17f_41de_914d_86da5486109f.slice. 
May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634110 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-flexvol-driver-host\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634141 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-log-dir\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634159 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-xtables-lock\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634173 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-lib-modules\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634198 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-net-dir\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634257 kubelet[2831]: I0514 23:41:44.634215 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s25m\" (UniqueName: \"kubernetes.io/projected/025e0036-3e02-4104-8cdd-fedbf0774454-kube-api-access-9s25m\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634231 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/025e0036-3e02-4104-8cdd-fedbf0774454-node-certs\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634247 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-bin-dir\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634262 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/025e0036-3e02-4104-8cdd-fedbf0774454-tigera-ca-bundle\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634273 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-policysync\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634286 
2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-run-calico\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634507 kubelet[2831]: I0514 23:41:44.634296 2831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-lib-calico\") pod \"025e0036-3e02-4104-8cdd-fedbf0774454\" (UID: \"025e0036-3e02-4104-8cdd-fedbf0774454\") " May 14 23:41:44.634674 kubelet[2831]: I0514 23:41:44.634340 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-policysync\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634674 kubelet[2831]: I0514 23:41:44.634356 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/df9f0f41-c17f-41de-914d-86da5486109f-node-certs\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634674 kubelet[2831]: I0514 23:41:44.634368 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-var-lib-calico\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634674 kubelet[2831]: I0514 23:41:44.634382 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-cni-bin-dir\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634674 kubelet[2831]: I0514 23:41:44.634399 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-cni-net-dir\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634801 kubelet[2831]: I0514 23:41:44.634413 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-var-run-calico\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634801 kubelet[2831]: I0514 23:41:44.634427 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-cni-log-dir\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634801 kubelet[2831]: I0514 23:41:44.634440 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68k6\" (UniqueName: 
\"kubernetes.io/projected/df9f0f41-c17f-41de-914d-86da5486109f-kube-api-access-d68k6\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634801 kubelet[2831]: I0514 23:41:44.634454 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-lib-modules\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634801 kubelet[2831]: I0514 23:41:44.634467 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df9f0f41-c17f-41de-914d-86da5486109f-tigera-ca-bundle\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634925 kubelet[2831]: I0514 23:41:44.634481 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-xtables-lock\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.634925 kubelet[2831]: I0514 23:41:44.634493 2831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/df9f0f41-c17f-41de-914d-86da5486109f-flexvol-driver-host\") pod \"calico-node-k48fw\" (UID: \"df9f0f41-c17f-41de-914d-86da5486109f\") " pod="calico-system/calico-node-k48fw" May 14 23:41:44.640043 kubelet[2831]: I0514 23:41:44.638524 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.640043 kubelet[2831]: I0514 23:41:44.639735 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.640043 kubelet[2831]: I0514 23:41:44.639757 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.640043 kubelet[2831]: I0514 23:41:44.639776 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "cni-net-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.640281 kubelet[2831]: I0514 23:41:44.638522 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.642897 kubelet[2831]: I0514 23:41:44.642862 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025e0036-3e02-4104-8cdd-fedbf0774454-node-certs" (OuterVolumeSpecName: "node-certs") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" May 14 23:41:44.642955 kubelet[2831]: I0514 23:41:44.642904 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.643412 kubelet[2831]: I0514 23:41:44.643372 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025e0036-3e02-4104-8cdd-fedbf0774454-kube-api-access-9s25m" (OuterVolumeSpecName: "kube-api-access-9s25m") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "kube-api-access-9s25m". PluginName "kubernetes.io/projected", VolumeGidValue "" May 14 23:41:44.645291 systemd[1]: var-lib-kubelet-pods-025e0036\x2d3e02\x2d4104\x2d8cdd\x2dfedbf0774454-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9s25m.mount: Deactivated successfully. May 14 23:41:44.645433 systemd[1]: var-lib-kubelet-pods-025e0036\x2d3e02\x2d4104\x2d8cdd\x2dfedbf0774454-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. May 14 23:41:44.650627 kubelet[2831]: I0514 23:41:44.650536 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.650627 kubelet[2831]: I0514 23:41:44.650571 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-policysync" (OuterVolumeSpecName: "policysync") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.650627 kubelet[2831]: I0514 23:41:44.650586 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "var-run-calico". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" May 14 23:41:44.668552 kubelet[2831]: I0514 23:41:44.668481 2831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/025e0036-3e02-4104-8cdd-fedbf0774454-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "025e0036-3e02-4104-8cdd-fedbf0774454" (UID: "025e0036-3e02-4104-8cdd-fedbf0774454"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" May 14 23:41:44.669679 systemd[1]: var-lib-kubelet-pods-025e0036\x2d3e02\x2d4104\x2d8cdd\x2dfedbf0774454-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744108 2831 reconciler_common.go:288] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-net-dir\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744136 2831 reconciler_common.go:288] "Volume detached for volume \"kube-api-access-9s25m\" (UniqueName: \"kubernetes.io/projected/025e0036-3e02-4104-8cdd-fedbf0774454-kube-api-access-9s25m\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744145 2831 reconciler_common.go:288] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744150 2831 reconciler_common.go:288] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/025e0036-3e02-4104-8cdd-fedbf0774454-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744155 2831 reconciler_common.go:288] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-policysync\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744160 2831 reconciler_common.go:288] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-run-calico\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744165 2831 reconciler_common.go:288] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-cni-log-dir\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744162 kubelet[2831]: I0514 23:41:44.744170 2831 reconciler_common.go:288] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-lib-modules\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744458 kubelet[2831]: I0514 23:41:44.744175 2831 reconciler_common.go:288] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/025e0036-3e02-4104-8cdd-fedbf0774454-node-certs\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744458 kubelet[2831]: I0514 23:41:44.744181 2831 reconciler_common.go:288] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-var-lib-calico\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744458 kubelet[2831]: I0514 23:41:44.744186 2831 reconciler_common.go:288] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.744458 kubelet[2831]: I0514 23:41:44.744190 2831 reconciler_common.go:288] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/025e0036-3e02-4104-8cdd-fedbf0774454-xtables-lock\") on node \"localhost\" DevicePath \"\"" May 14 23:41:44.853353 containerd[1574]: time="2025-05-14T23:41:44.853226619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k48fw,Uid:df9f0f41-c17f-41de-914d-86da5486109f,Namespace:calico-system,Attempt:0,}" May 14 23:41:44.866478 containerd[1574]: time="2025-05-14T23:41:44.866445763Z" level=info msg="connecting to shim 98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d" address="unix:///run/containerd/s/bd2d912d2865f87a91e0c3f1b2555c2e874c55e5ad1df5a653343e98f951d959" namespace=k8s.io protocol=ttrpc version=3 May 14 23:41:44.883427 systemd[1]: Started cri-containerd-98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d.scope - libcontainer container 98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d. May 14 23:41:44.903102 containerd[1574]: time="2025-05-14T23:41:44.903074676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-k48fw,Uid:df9f0f41-c17f-41de-914d-86da5486109f,Namespace:calico-system,Attempt:0,} returns sandbox id \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\"" May 14 23:41:44.904921 containerd[1574]: time="2025-05-14T23:41:44.904871808Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 23:41:44.910095 containerd[1574]: time="2025-05-14T23:41:44.910010711Z" level=info msg="Container 2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:44.914262 containerd[1574]: time="2025-05-14T23:41:44.914242697Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\"" May 14 23:41:44.915150 containerd[1574]: time="2025-05-14T23:41:44.915137397Z" level=info msg="StartContainer for \"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\"" May 14 23:41:44.916236 containerd[1574]: time="2025-05-14T23:41:44.916213657Z" level=info msg="connecting to shim 2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c" address="unix:///run/containerd/s/bd2d912d2865f87a91e0c3f1b2555c2e874c55e5ad1df5a653343e98f951d959" protocol=ttrpc version=3 May 14 23:41:44.928413 systemd[1]: Started cri-containerd-2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c.scope - libcontainer container 2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c. May 14 23:41:44.951127 kubelet[2831]: I0514 23:41:44.949595 2831 scope.go:117] "RemoveContainer" containerID="439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e" May 14 23:41:44.953318 systemd[1]: Removed slice kubepods-besteffort-pod025e0036_3e02_4104_8cdd_fedbf0774454.slice - libcontainer container kubepods-besteffort-pod025e0036_3e02_4104_8cdd_fedbf0774454.slice. 
May 14 23:41:44.953381 systemd[1]: kubepods-besteffort-pod025e0036_3e02_4104_8cdd_fedbf0774454.slice: Consumed 522ms CPU time, 151M memory peak, 21.2M read from disk, 160.4M written to disk. May 14 23:41:44.956168 containerd[1574]: time="2025-05-14T23:41:44.956141848Z" level=info msg="RemoveContainer for \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\"" May 14 23:41:44.960593 containerd[1574]: time="2025-05-14T23:41:44.960574224Z" level=info msg="RemoveContainer for \"439fd08a05beb882e787616a7b9288888cef8432efe49ac8da0816cfd2c64c5e\" returns successfully" May 14 23:41:44.960828 kubelet[2831]: I0514 23:41:44.960813 2831 scope.go:117] "RemoveContainer" containerID="27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2" May 14 23:41:44.962288 containerd[1574]: time="2025-05-14T23:41:44.962265203Z" level=info msg="RemoveContainer for \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\"" May 14 23:41:44.968747 containerd[1574]: time="2025-05-14T23:41:44.968717793Z" level=info msg="StartContainer for \"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\" returns successfully" May 14 23:41:44.969619 containerd[1574]: time="2025-05-14T23:41:44.969529511Z" level=info msg="RemoveContainer for \"27a7873a1bbcc5eabe574656f676f848878dcc0f30a5ec6d04b7d726da07f6f2\" returns successfully" May 14 23:41:44.969912 kubelet[2831]: I0514 23:41:44.969887 2831 scope.go:117] "RemoveContainer" containerID="b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42" May 14 23:41:44.973762 containerd[1574]: time="2025-05-14T23:41:44.973745043Z" level=info msg="RemoveContainer for \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\"" May 14 23:41:44.977328 containerd[1574]: time="2025-05-14T23:41:44.976629846Z" level=info msg="RemoveContainer for \"b7b7c19c5a31f4ce6bb375ecf315de2b9d0a35d02b5d34ef3d26f31ba7002e42\" returns successfully" May 14 23:41:45.028737 systemd[1]: cri-containerd-2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c.scope: Deactivated successfully. May 14 23:41:45.028912 systemd[1]: cri-containerd-2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c.scope: Consumed 25ms CPU time, 17.8M memory peak, 9.8M read from disk, 6.3M written to disk. 
May 14 23:41:45.030778 containerd[1574]: time="2025-05-14T23:41:45.030751665Z" level=info msg="received exit event container_id:\"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\" id:\"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\" pid:4663 exited_at:{seconds:1747266105 nanos:30616516}" May 14 23:41:45.030934 containerd[1574]: time="2025-05-14T23:41:45.030880917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\" id:\"2c47360d02f0567ed1e5e5814bbfb8aeb036460a31baf55ea147a04eb1a68c5c\" pid:4663 exited_at:{seconds:1747266105 nanos:30616516}" May 14 23:41:45.512275 kubelet[2831]: I0514 23:41:45.512007 2831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025e0036-3e02-4104-8cdd-fedbf0774454" path="/var/lib/kubelet/pods/025e0036-3e02-4104-8cdd-fedbf0774454/volumes" May 14 23:41:45.955615 containerd[1574]: time="2025-05-14T23:41:45.954622151Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 23:41:45.996068 containerd[1574]: time="2025-05-14T23:41:45.995245140Z" level=info msg="Container b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:46.022017 containerd[1574]: time="2025-05-14T23:41:46.021987735Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\"" May 14 23:41:46.022683 containerd[1574]: time="2025-05-14T23:41:46.022378121Z" level=info msg="StartContainer for \"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\"" May 14 23:41:46.024397 containerd[1574]: time="2025-05-14T23:41:46.024217375Z" level=info msg="connecting to shim b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d" address="unix:///run/containerd/s/bd2d912d2865f87a91e0c3f1b2555c2e874c55e5ad1df5a653343e98f951d959" protocol=ttrpc version=3 May 14 23:41:46.043461 systemd[1]: Started cri-containerd-b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d.scope - libcontainer container b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d. May 14 23:41:46.069569 containerd[1574]: time="2025-05-14T23:41:46.069518108Z" level=info msg="StartContainer for \"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\" returns successfully" May 14 23:41:47.331111 systemd[1]: cri-containerd-b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d.scope: Deactivated successfully. 
May 14 23:41:47.331558 containerd[1574]: time="2025-05-14T23:41:47.331516989Z" level=info msg="received exit event container_id:\"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\" id:\"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\" pid:4712 exited_at:{seconds:1747266107 nanos:331327456}" May 14 23:41:47.332060 containerd[1574]: time="2025-05-14T23:41:47.331709351Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\" id:\"b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d\" pid:4712 exited_at:{seconds:1747266107 nanos:331327456}" May 14 23:41:47.331747 systemd[1]: cri-containerd-b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d.scope: Consumed 496ms CPU time, 145.7M memory peak, 130.6M read from disk. May 14 23:41:47.350623 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b031efa5ebbf638bd1b4701281e4ca16c50aa470e10229e9dd9e883125e6737d-rootfs.mount: Deactivated successfully. May 14 23:41:47.511233 containerd[1574]: time="2025-05-14T23:41:47.511210326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:47.511543 containerd[1574]: time="2025-05-14T23:41:47.511234188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:41:47.572825 containerd[1574]: time="2025-05-14T23:41:47.572784308Z" level=error msg="Failed to destroy network for sandbox \"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.577493 containerd[1574]: time="2025-05-14T23:41:47.576209527Z" level=error msg="Failed to destroy network for sandbox \"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.574132 systemd[1]: run-netns-cni\x2d7981a6ce\x2d5ba4\x2d13bc\x2d99eb\x2d97a0ed59f72d.mount: Deactivated successfully. May 14 23:41:47.577391 systemd[1]: run-netns-cni\x2d92c47d8c\x2d1c79\x2d0f6f\x2def7c\x2daa6aa7bda6ec.mount: Deactivated successfully. 
May 14 23:41:47.577977 containerd[1574]: time="2025-05-14T23:41:47.577667907Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.578037 kubelet[2831]: E0514 23:41:47.577959 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.578037 kubelet[2831]: E0514 23:41:47.577998 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:47.578037 kubelet[2831]: E0514 23:41:47.578014 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" May 14 23:41:47.578352 kubelet[2831]: E0514 23:41:47.578040 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-549c9ddc47-k62bq_calico-apiserver(18a843ce-b18e-4959-8f6e-0f50bd293efe)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09b2a91d49717c6b71088c8afe951e9a6ac44deecb9dae7d7707992c3a26f768\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" podUID="18a843ce-b18e-4959-8f6e-0f50bd293efe" May 14 23:41:47.589171 containerd[1574]: time="2025-05-14T23:41:47.589077602Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.589566 kubelet[2831]: E0514 23:41:47.589542 2831 log.go:32] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:47.589633 kubelet[2831]: E0514 23:41:47.589578 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:47.589633 kubelet[2831]: E0514 23:41:47.589592 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-w9hfg" May 14 23:41:47.589633 kubelet[2831]: E0514 23:41:47.589618 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-w9hfg_kube-system(5e86d3e4-1ed5-4ded-9be6-007462b12e77)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"270ee1d79af676ebdc103a966400ba52e26d73699e13412989de68e752643401\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-w9hfg" podUID="5e86d3e4-1ed5-4ded-9be6-007462b12e77" May 14 23:41:48.121853 containerd[1574]: time="2025-05-14T23:41:48.121825812Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 23:41:48.159703 containerd[1574]: time="2025-05-14T23:41:48.159606709Z" level=info msg="Container b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:48.188895 containerd[1574]: time="2025-05-14T23:41:48.188866094Z" level=info msg="CreateContainer within sandbox \"98969532f16af2ff987f86380d82697b464fcf1c890d0d6ad9904bd57e0abb5d\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\"" May 14 23:41:48.189711 containerd[1574]: time="2025-05-14T23:41:48.189255197Z" level=info msg="StartContainer for \"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\"" May 14 23:41:48.190361 containerd[1574]: time="2025-05-14T23:41:48.190347713Z" level=info msg="connecting to shim b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736" address="unix:///run/containerd/s/bd2d912d2865f87a91e0c3f1b2555c2e874c55e5ad1df5a653343e98f951d959" protocol=ttrpc version=3 May 14 23:41:48.206396 systemd[1]: Started cri-containerd-b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736.scope 
- libcontainer container b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736. May 14 23:41:48.243571 containerd[1574]: time="2025-05-14T23:41:48.243504069Z" level=info msg="StartContainer for \"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\" returns successfully" May 14 23:41:48.511997 containerd[1574]: time="2025-05-14T23:41:48.511717840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:41:48.556336 containerd[1574]: time="2025-05-14T23:41:48.556289592Z" level=error msg="Failed to destroy network for sandbox \"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:48.557603 systemd[1]: run-netns-cni\x2d5687267a\x2de666\x2d09c9\x2d35b9\x2dbf28817ebc8e.mount: Deactivated successfully. May 14 23:41:48.559175 containerd[1574]: time="2025-05-14T23:41:48.559148607Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:48.559848 kubelet[2831]: E0514 23:41:48.559303 2831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 23:41:48.559848 kubelet[2831]: E0514 23:41:48.559551 2831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:48.559848 kubelet[2831]: E0514 23:41:48.559566 2831 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" May 14 23:41:48.560171 kubelet[2831]: E0514 23:41:48.559596 2831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7cd45688fd-h8dqc_calico-system(26bdfeca-0687-46a9-90a0-450c95fd195f)\\\": rpc 
error: code = Unknown desc = failed to setup network for sandbox \\\"a3c5146e32a84e24b51fa7a5701c84a83e195698b4ac4ac2e3768681c5b72b04\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podUID="26bdfeca-0687-46a9-90a0-450c95fd195f" May 14 23:41:48.936647 systemd[1]: Started sshd@8-139.178.70.107:22-147.75.109.163:49358.service - OpenSSH per-connection server daemon (147.75.109.163:49358). May 14 23:41:48.993729 kubelet[2831]: I0514 23:41:48.993629 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-k48fw" podStartSLOduration=4.993612356 podStartE2EDuration="4.993612356s" podCreationTimestamp="2025-05-14 23:41:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:41:48.993148313 +0000 UTC m=+75.625949980" watchObservedRunningTime="2025-05-14 23:41:48.993612356 +0000 UTC m=+75.626414023" May 14 23:41:49.045914 containerd[1574]: time="2025-05-14T23:41:49.045885896Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\" id:\"04f141431cf54377170517a346a51b7debaf64b07860c614de990c0ad79d545e\" pid:4902 exit_status:1 exited_at:{seconds:1747266109 nanos:45674108}" May 14 23:41:49.438083 sshd[4888]: Accepted publickey for core from 147.75.109.163 port 49358 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:41:49.438469 sshd-session[4888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:41:49.448516 systemd-logind[1552]: New session 11 of user core. May 14 23:41:49.457429 systemd[1]: Started session-11.scope - Session 11 of User core. May 14 23:41:50.133666 containerd[1574]: time="2025-05-14T23:41:50.133548491Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\" id:\"24d4d60e6949f0777481c52111db1eef26f3136c0414a21e153c74e5fb8dc321\" pid:5030 exit_status:1 exited_at:{seconds:1747266110 nanos:132709885}" May 14 23:41:50.463340 kernel: bpftool[5073]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 14 23:41:50.721707 sshd[4914]: Connection closed by 147.75.109.163 port 49358 May 14 23:41:50.722765 sshd-session[4888]: pam_unix(sshd:session): session closed for user core May 14 23:41:50.726137 systemd-logind[1552]: Session 11 logged out. Waiting for processes to exit. May 14 23:41:50.726589 systemd[1]: sshd@8-139.178.70.107:22-147.75.109.163:49358.service: Deactivated successfully. May 14 23:41:50.728247 systemd[1]: session-11.scope: Deactivated successfully. May 14 23:41:50.728869 systemd-logind[1552]: Removed session 11. 
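Every sandbox failure in this stretch reduces to the same condition: the Calico CNI plugin stats /var/lib/calico/nodename and the file does not exist yet, because it is only written once a calico/node container is running with /var/lib/calico/ mounted. A minimal, hypothetical sketch of that check (not the plugin's actual code) is:

    package main

    import (
        "fmt"
        "os"
    )

    func main() {
        const nodenameFile = "/var/lib/calico/nodename" // path named in the errors above
        if _, err := os.Stat(nodenameFile); err != nil {
            // Mirrors the plugin's failure mode: the file is absent until
            // calico/node is running and has mounted /var/lib/calico/.
            fmt.Fprintf(os.Stderr, "%v: check that the calico/node container is running and has mounted /var/lib/calico/\n", err)
            os.Exit(1)
        }
        name, err := os.ReadFile(nodenameFile)
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        fmt.Printf("calico nodename: %s\n", name)
    }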
May 14 23:41:50.805471 systemd-networkd[1463]: vxlan.calico: Link UP May 14 23:41:50.805476 systemd-networkd[1463]: vxlan.calico: Gained carrier May 14 23:41:51.512126 containerd[1574]: time="2025-05-14T23:41:51.511521272Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,}" May 14 23:41:51.512596 containerd[1574]: time="2025-05-14T23:41:51.511521360Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,}" May 14 23:41:51.512929 containerd[1574]: time="2025-05-14T23:41:51.512895499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:52.115538 systemd-networkd[1463]: vxlan.calico: Gained IPv6LL May 14 23:41:52.405406 systemd-networkd[1463]: cali825e528ba9d: Link UP May 14 23:41:52.406298 systemd-networkd[1463]: cali825e528ba9d: Gained carrier May 14 23:41:52.420628 containerd[1574]: 2025-05-14 23:41:52.333 [INFO][5160] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0 coredns-6f6b679f8f- kube-system 91fe5f31-0539-49b6-83f5-1dfaad928a0a 674 0 2025-05-14 23:40:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-rnrcm eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali825e528ba9d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-" May 14 23:41:52.420628 containerd[1574]: 2025-05-14 23:41:52.333 [INFO][5160] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.420628 containerd[1574]: 2025-05-14 23:41:52.363 [INFO][5208] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" HandleID="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Workload="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.375 [INFO][5208] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" HandleID="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Workload="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031d4f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-rnrcm", "timestamp":"2025-05-14 23:41:52.363529757 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:41:52.420821 
containerd[1574]: 2025-05-14 23:41:52.375 [INFO][5208] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.376 [INFO][5208] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.376 [INFO][5208] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.379 [INFO][5208] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" host="localhost" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.384 [INFO][5208] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.387 [INFO][5208] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.388 [INFO][5208] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.389 [INFO][5208] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.420821 containerd[1574]: 2025-05-14 23:41:52.389 [INFO][5208] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" host="localhost" May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.390 [INFO][5208] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.392 [INFO][5208] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" host="localhost" May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5208] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" host="localhost" May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5208] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" host="localhost" May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5208] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:41:52.422758 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5208] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" HandleID="k8s-pod-network.44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Workload="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.422927 containerd[1574]: 2025-05-14 23:41:52.401 [INFO][5160] cni-plugin/k8s.go 386: Populated endpoint ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"91fe5f31-0539-49b6-83f5-1dfaad928a0a", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-rnrcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali825e528ba9d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.423000 containerd[1574]: 2025-05-14 23:41:52.401 [INFO][5160] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.423000 containerd[1574]: 2025-05-14 23:41:52.401 [INFO][5160] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali825e528ba9d ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.423000 containerd[1574]: 2025-05-14 23:41:52.406 [INFO][5160] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.423857 containerd[1574]: 2025-05-14 23:41:52.406 
[INFO][5160] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"91fe5f31-0539-49b6-83f5-1dfaad928a0a", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b", Pod:"coredns-6f6b679f8f-rnrcm", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali825e528ba9d", MAC:"0e:1b:d4:e3:a1:a5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.423857 containerd[1574]: 2025-05-14 23:41:52.417 [INFO][5160] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" Namespace="kube-system" Pod="coredns-6f6b679f8f-rnrcm" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--rnrcm-eth0" May 14 23:41:52.447265 containerd[1574]: time="2025-05-14T23:41:52.447217187Z" level=info msg="connecting to shim 44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b" address="unix:///run/containerd/s/440fed05d00db4b7442d3ff40b53a5dba18df8d7ec6dc4ef97874f1b483ac28e" namespace=k8s.io protocol=ttrpc version=3 May 14 23:41:52.469474 systemd[1]: Started cri-containerd-44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b.scope - libcontainer container 44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b. 
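The IPAM trace above shows the plugin taking the host-wide IPAM lock, confirming this node's affinity for the 192.168.88.128/26 block, and claiming 192.168.88.129 for coredns-6f6b679f8f-rnrcm; the subsequent pods in this section receive .130, .131 and .132 from the same block. A sketch of how the allocation could be inspected afterwards, assuming calicoctl is installed and configured against the same datastore (nothing in this log confirms that):

# Per-node block affinities and utilisation of 192.168.88.128/26.
calicoctl ipam show --show-blocks

# Which workload owns the address just claimed.
calicoctl ipam show --ip=192.168.88.129

# The WorkloadEndpoint written to the datastore for the coredns pod.
calicoctl get workloadendpoints -n kube-system -o wide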
May 14 23:41:52.478117 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:41:52.511460 systemd-networkd[1463]: cali3576cacc109: Link UP May 14 23:41:52.511839 systemd-networkd[1463]: cali3576cacc109: Gained carrier May 14 23:41:52.525810 containerd[1574]: time="2025-05-14T23:41:52.525785035Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-rnrcm,Uid:91fe5f31-0539-49b6-83f5-1dfaad928a0a,Namespace:kube-system,Attempt:0,} returns sandbox id \"44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b\"" May 14 23:41:52.527523 containerd[1574]: time="2025-05-14T23:41:52.527503730Z" level=info msg="CreateContainer within sandbox \"44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.333 [INFO][5163] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--9zp9w-eth0 csi-node-driver- calico-system 018bb5c2-b3b5-4cf8-a7ea-a530d2470442 583 0 2025-05-14 23:40:44 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-9zp9w eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali3576cacc109 [] []}} ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.333 [INFO][5163] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.368 [INFO][5203] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" HandleID="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Workload="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.384 [INFO][5203] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" HandleID="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Workload="localhost-k8s-csi--node--driver--9zp9w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290b20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-9zp9w", "timestamp":"2025-05-14 23:41:52.367792059 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.384 [INFO][5203] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5203] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.399 [INFO][5203] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.478 [INFO][5203] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.485 [INFO][5203] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.491 [INFO][5203] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.492 [INFO][5203] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.494 [INFO][5203] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.494 [INFO][5203] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.495 [INFO][5203] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8 May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.498 [INFO][5203] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.505 [INFO][5203] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.506 [INFO][5203] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" host="localhost" May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.506 [INFO][5203] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:41:52.536491 containerd[1574]: 2025-05-14 23:41:52.506 [INFO][5203] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" HandleID="k8s-pod-network.8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Workload="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.508 [INFO][5163] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9zp9w-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"018bb5c2-b3b5-4cf8-a7ea-a530d2470442", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-9zp9w", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3576cacc109", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.509 [INFO][5163] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.509 [INFO][5163] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3576cacc109 ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.510 [INFO][5163] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.512 [INFO][5163] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--9zp9w-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"018bb5c2-b3b5-4cf8-a7ea-a530d2470442", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8", Pod:"csi-node-driver-9zp9w", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali3576cacc109", MAC:"22:09:74:ac:fe:42", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.546086 containerd[1574]: 2025-05-14 23:41:52.534 [INFO][5163] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" Namespace="calico-system" Pod="csi-node-driver-9zp9w" WorkloadEndpoint="localhost-k8s-csi--node--driver--9zp9w-eth0" May 14 23:41:52.574622 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3059020451.mount: Deactivated successfully. 
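Each WorkloadEndpoint above is wired to a host-side veth (cali825e528ba9d for coredns, cali3576cacc109 for csi-node-driver), and cross-node pod traffic rides the vxlan.calico interface that gained carrier earlier. A quick way to look at this wiring on the host, assuming only iproute2; the interface names are taken from the log itself:

# VXLAN details: VNI, local VTEP address and UDP destination port.
ip -d link show vxlan.calico

# Host side of the coredns and csi-node-driver veths; Calico typically
# leaves them without an IPv4 address and routes via per-pod /32s instead.
ip addr show cali825e528ba9d
ip addr show cali3576cacc109

# Per-pod /32 routes pointing at the cali* interfaces.
ip route show | grep cali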
May 14 23:41:52.579247 containerd[1574]: time="2025-05-14T23:41:52.579213288Z" level=info msg="Container ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:52.581881 containerd[1574]: time="2025-05-14T23:41:52.581855849Z" level=info msg="connecting to shim 8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8" address="unix:///run/containerd/s/cd15f684b501a34a83bf861c135e267a0a1a6135471f5b9db05b6cb284d00fb1" namespace=k8s.io protocol=ttrpc version=3 May 14 23:41:52.587896 containerd[1574]: time="2025-05-14T23:41:52.587854431Z" level=info msg="CreateContainer within sandbox \"44a4a914fd5854f6abee5ef4cdc84585127177a32c325f20cd5452b774771b4b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4\"" May 14 23:41:52.589669 containerd[1574]: time="2025-05-14T23:41:52.589117600Z" level=info msg="StartContainer for \"ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4\"" May 14 23:41:52.590829 containerd[1574]: time="2025-05-14T23:41:52.590809805Z" level=info msg="connecting to shim ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4" address="unix:///run/containerd/s/440fed05d00db4b7442d3ff40b53a5dba18df8d7ec6dc4ef97874f1b483ac28e" protocol=ttrpc version=3 May 14 23:41:52.611488 systemd[1]: Started cri-containerd-8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8.scope - libcontainer container 8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8. May 14 23:41:52.613320 systemd[1]: Started cri-containerd-ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4.scope - libcontainer container ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4. 
May 14 23:41:52.620549 systemd-networkd[1463]: cali0cbd3e40683: Link UP May 14 23:41:52.620788 systemd-networkd[1463]: cali0cbd3e40683: Gained carrier May 14 23:41:52.630413 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.333 [INFO][5169] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0 calico-apiserver-549c9ddc47- calico-apiserver b16fe825-67e1-4ddb-b1d8-e5a537b879ea 673 0 2025-05-14 23:40:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:549c9ddc47 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-549c9ddc47-qvrrc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0cbd3e40683 [] []}} ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.334 [INFO][5169] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.377 [INFO][5210] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" HandleID="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Workload="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.385 [INFO][5210] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" HandleID="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Workload="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031cf20), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-549c9ddc47-qvrrc", "timestamp":"2025-05-14 23:41:52.377918152 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.385 [INFO][5210] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.506 [INFO][5210] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.506 [INFO][5210] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.580 [INFO][5210] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.585 [INFO][5210] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.594 [INFO][5210] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.596 [INFO][5210] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.600 [INFO][5210] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.600 [INFO][5210] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.602 [INFO][5210] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.608 [INFO][5210] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.615 [INFO][5210] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.615 [INFO][5210] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" host="localhost" May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.615 [INFO][5210] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:41:52.643734 containerd[1574]: 2025-05-14 23:41:52.615 [INFO][5210] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" HandleID="k8s-pod-network.36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Workload="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.618 [INFO][5169] cni-plugin/k8s.go 386: Populated endpoint ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0", GenerateName:"calico-apiserver-549c9ddc47-", Namespace:"calico-apiserver", SelfLink:"", UID:"b16fe825-67e1-4ddb-b1d8-e5a537b879ea", ResourceVersion:"673", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549c9ddc47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-549c9ddc47-qvrrc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0cbd3e40683", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.618 [INFO][5169] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.618 [INFO][5169] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0cbd3e40683 ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.620 [INFO][5169] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.621 [INFO][5169] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0", GenerateName:"calico-apiserver-549c9ddc47-", Namespace:"calico-apiserver", SelfLink:"", UID:"b16fe825-67e1-4ddb-b1d8-e5a537b879ea", ResourceVersion:"673", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549c9ddc47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe", Pod:"calico-apiserver-549c9ddc47-qvrrc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0cbd3e40683", MAC:"6a:ab:f2:4e:c8:d6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:52.645076 containerd[1574]: 2025-05-14 23:41:52.634 [INFO][5169] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-qvrrc" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--qvrrc-eth0" May 14 23:41:52.653380 containerd[1574]: time="2025-05-14T23:41:52.653350425Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9zp9w,Uid:018bb5c2-b3b5-4cf8-a7ea-a530d2470442,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8\"" May 14 23:41:52.655778 containerd[1574]: time="2025-05-14T23:41:52.654779446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 14 23:41:52.680355 containerd[1574]: time="2025-05-14T23:41:52.680330628Z" level=info msg="StartContainer for \"ee5633846ff788f3e469817d7f89f43e9156e58c4aa22723da398ef7a725f3d4\" returns successfully" May 14 23:41:52.686326 containerd[1574]: time="2025-05-14T23:41:52.685941531Z" level=info msg="connecting to shim 36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe" address="unix:///run/containerd/s/3ce216621dbbc0ab64f59d16e0658b26cda22d7c73bf29a97113994185969edd" namespace=k8s.io protocol=ttrpc version=3 May 14 23:41:52.701406 systemd[1]: Started cri-containerd-36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe.scope - libcontainer container 36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe. 
May 14 23:41:52.710465 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:41:52.742437 containerd[1574]: time="2025-05-14T23:41:52.742408969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-qvrrc,Uid:b16fe825-67e1-4ddb-b1d8-e5a537b879ea,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe\"" May 14 23:41:53.112071 kubelet[2831]: I0514 23:41:53.112029 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-rnrcm" podStartSLOduration=76.077508813 podStartE2EDuration="1m16.077508813s" podCreationTimestamp="2025-05-14 23:40:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:41:53.077381146 +0000 UTC m=+79.710182813" watchObservedRunningTime="2025-05-14 23:41:53.077508813 +0000 UTC m=+79.710310474" May 14 23:41:53.971406 systemd-networkd[1463]: cali0cbd3e40683: Gained IPv6LL May 14 23:41:54.035520 systemd-networkd[1463]: cali3576cacc109: Gained IPv6LL May 14 23:41:54.036129 systemd-networkd[1463]: cali825e528ba9d: Gained IPv6LL May 14 23:41:54.594817 containerd[1574]: time="2025-05-14T23:41:54.594404083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:54.602014 containerd[1574]: time="2025-05-14T23:41:54.601982975Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 14 23:41:54.615898 containerd[1574]: time="2025-05-14T23:41:54.615879935Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:54.625871 containerd[1574]: time="2025-05-14T23:41:54.625818113Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:54.626599 containerd[1574]: time="2025-05-14T23:41:54.626246057Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 1.970533321s" May 14 23:41:54.626599 containerd[1574]: time="2025-05-14T23:41:54.626270143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 14 23:41:54.627077 containerd[1574]: time="2025-05-14T23:41:54.627061530Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 23:41:54.632493 containerd[1574]: time="2025-05-14T23:41:54.632461247Z" level=info msg="CreateContainer within sandbox \"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 14 23:41:54.691611 containerd[1574]: time="2025-05-14T23:41:54.690519784Z" level=info msg="Container 2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:54.693529 
systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3539695450.mount: Deactivated successfully. May 14 23:41:54.700522 containerd[1574]: time="2025-05-14T23:41:54.700400948Z" level=info msg="CreateContainer within sandbox \"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f\"" May 14 23:41:54.702035 containerd[1574]: time="2025-05-14T23:41:54.701964515Z" level=info msg="StartContainer for \"2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f\"" May 14 23:41:54.704011 containerd[1574]: time="2025-05-14T23:41:54.703922022Z" level=info msg="connecting to shim 2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f" address="unix:///run/containerd/s/cd15f684b501a34a83bf861c135e267a0a1a6135471f5b9db05b6cb284d00fb1" protocol=ttrpc version=3 May 14 23:41:54.722489 systemd[1]: Started cri-containerd-2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f.scope - libcontainer container 2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f. May 14 23:41:54.786030 containerd[1574]: time="2025-05-14T23:41:54.785931452Z" level=info msg="StartContainer for \"2f0c55917c6998a9ca8e6aa57d9f08ae6c6fb0e82c1e588e9588edb31b10d11f\" returns successfully" May 14 23:41:55.731912 systemd[1]: Started sshd@9-139.178.70.107:22-147.75.109.163:49370.service - OpenSSH per-connection server daemon (147.75.109.163:49370). May 14 23:41:55.808489 sshd[5475]: Accepted publickey for core from 147.75.109.163 port 49370 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:41:55.809792 sshd-session[5475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:41:55.814548 systemd-logind[1552]: New session 12 of user core. May 14 23:41:55.822473 systemd[1]: Started session-12.scope - Session 12 of User core. May 14 23:41:56.901797 sshd[5477]: Connection closed by 147.75.109.163 port 49370 May 14 23:41:56.902301 sshd-session[5475]: pam_unix(sshd:session): session closed for user core May 14 23:41:56.908965 systemd[1]: sshd@9-139.178.70.107:22-147.75.109.163:49370.service: Deactivated successfully. May 14 23:41:56.910342 systemd[1]: session-12.scope: Deactivated successfully. May 14 23:41:56.910852 systemd-logind[1552]: Session 12 logged out. Waiting for processes to exit. May 14 23:41:56.912707 systemd[1]: Started sshd@10-139.178.70.107:22-147.75.109.163:49386.service - OpenSSH per-connection server daemon (147.75.109.163:49386). May 14 23:41:56.913684 systemd-logind[1552]: Removed session 12. May 14 23:41:56.949421 sshd[5497]: Accepted publickey for core from 147.75.109.163 port 49386 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:41:56.951009 sshd-session[5497]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:41:56.954407 systemd-logind[1552]: New session 13 of user core. May 14 23:41:56.959483 systemd[1]: Started session-13.scope - Session 13 of User core. May 14 23:41:57.216459 sshd[5500]: Connection closed by 147.75.109.163 port 49386 May 14 23:41:57.217390 sshd-session[5497]: pam_unix(sshd:session): session closed for user core May 14 23:41:57.227824 systemd[1]: sshd@10-139.178.70.107:22-147.75.109.163:49386.service: Deactivated successfully. May 14 23:41:57.231117 systemd[1]: session-13.scope: Deactivated successfully. May 14 23:41:57.235491 systemd-logind[1552]: Session 13 logged out. 
Waiting for processes to exit. May 14 23:41:57.239939 systemd[1]: Started sshd@11-139.178.70.107:22-147.75.109.163:49402.service - OpenSSH per-connection server daemon (147.75.109.163:49402). May 14 23:41:57.243661 systemd-logind[1552]: Removed session 13. May 14 23:41:57.321141 sshd[5510]: Accepted publickey for core from 147.75.109.163 port 49402 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:41:57.322261 sshd-session[5510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:41:57.325022 systemd-logind[1552]: New session 14 of user core. May 14 23:41:57.329422 systemd[1]: Started session-14.scope - Session 14 of User core. May 14 23:41:57.467658 sshd[5513]: Connection closed by 147.75.109.163 port 49402 May 14 23:41:57.467561 sshd-session[5510]: pam_unix(sshd:session): session closed for user core May 14 23:41:57.470943 systemd[1]: sshd@11-139.178.70.107:22-147.75.109.163:49402.service: Deactivated successfully. May 14 23:41:57.472227 systemd[1]: session-14.scope: Deactivated successfully. May 14 23:41:57.472908 systemd-logind[1552]: Session 14 logged out. Waiting for processes to exit. May 14 23:41:57.473584 systemd-logind[1552]: Removed session 14. May 14 23:41:58.554522 containerd[1574]: time="2025-05-14T23:41:58.554489562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,}" May 14 23:41:58.836901 containerd[1574]: time="2025-05-14T23:41:58.835762088Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:58.891881 containerd[1574]: time="2025-05-14T23:41:58.891839367Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 14 23:41:58.917978 containerd[1574]: time="2025-05-14T23:41:58.917414460Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:58.936922 containerd[1574]: time="2025-05-14T23:41:58.936687950Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:41:58.936922 containerd[1574]: time="2025-05-14T23:41:58.936890127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.309756445s" May 14 23:41:58.936922 containerd[1574]: time="2025-05-14T23:41:58.936905508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 23:41:58.959827 containerd[1574]: time="2025-05-14T23:41:58.959426966Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 14 23:41:58.986766 containerd[1574]: time="2025-05-14T23:41:58.985963820Z" level=info msg="CreateContainer within sandbox \"36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe\" for container 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 23:41:59.107005 systemd-networkd[1463]: cali7380803dcbc: Link UP May 14 23:41:59.108205 systemd-networkd[1463]: cali7380803dcbc: Gained carrier May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.847 [INFO][5534] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0 calico-apiserver-549c9ddc47- calico-apiserver 18a843ce-b18e-4959-8f6e-0f50bd293efe 670 0 2025-05-14 23:40:43 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:549c9ddc47 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-549c9ddc47-k62bq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7380803dcbc [] []}} ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.875 [INFO][5534] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.948 [INFO][5542] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" HandleID="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Workload="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.959 [INFO][5542] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" HandleID="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Workload="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d68f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-549c9ddc47-k62bq", "timestamp":"2025-05-14 23:41:58.948574049 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.959 [INFO][5542] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.959 [INFO][5542] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.959 [INFO][5542] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:58.977 [INFO][5542] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.058 [INFO][5542] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.077 [INFO][5542] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.078 [INFO][5542] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.087 [INFO][5542] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.087 [INFO][5542] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.088 [INFO][5542] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7 May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.091 [INFO][5542] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.103 [INFO][5542] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.103 [INFO][5542] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" host="localhost" May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.103 [INFO][5542] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:41:59.126766 containerd[1574]: 2025-05-14 23:41:59.103 [INFO][5542] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" HandleID="k8s-pod-network.042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Workload="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.105 [INFO][5534] cni-plugin/k8s.go 386: Populated endpoint ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0", GenerateName:"calico-apiserver-549c9ddc47-", Namespace:"calico-apiserver", SelfLink:"", UID:"18a843ce-b18e-4959-8f6e-0f50bd293efe", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549c9ddc47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-549c9ddc47-k62bq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7380803dcbc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.105 [INFO][5534] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.105 [INFO][5534] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7380803dcbc ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.107 [INFO][5534] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.108 [INFO][5534] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0", GenerateName:"calico-apiserver-549c9ddc47-", Namespace:"calico-apiserver", SelfLink:"", UID:"18a843ce-b18e-4959-8f6e-0f50bd293efe", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 43, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"549c9ddc47", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7", Pod:"calico-apiserver-549c9ddc47-k62bq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7380803dcbc", MAC:"d2:82:c0:6e:93:6d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:41:59.141816 containerd[1574]: 2025-05-14 23:41:59.125 [INFO][5534] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" Namespace="calico-apiserver" Pod="calico-apiserver-549c9ddc47-k62bq" WorkloadEndpoint="localhost-k8s-calico--apiserver--549c9ddc47--k62bq-eth0" May 14 23:41:59.185440 containerd[1574]: time="2025-05-14T23:41:59.185409123Z" level=info msg="Container 6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:59.282121 containerd[1574]: time="2025-05-14T23:41:59.282094905Z" level=info msg="CreateContainer within sandbox \"36d9a0935c5d683191449888e9465dbb70d8434653fb8c557281ecab55e1c5fe\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d\"" May 14 23:41:59.283029 containerd[1574]: time="2025-05-14T23:41:59.283007150Z" level=info msg="StartContainer for \"6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d\"" May 14 23:41:59.284483 containerd[1574]: time="2025-05-14T23:41:59.284458589Z" level=info msg="connecting to shim 6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d" address="unix:///run/containerd/s/3ce216621dbbc0ab64f59d16e0658b26cda22d7c73bf29a97113994185969edd" protocol=ttrpc version=3 May 14 23:41:59.388445 systemd[1]: Started cri-containerd-6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d.scope - libcontainer container 6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d. 
May 14 23:41:59.441108 containerd[1574]: time="2025-05-14T23:41:59.441015369Z" level=info msg="StartContainer for \"6cdf6f6ebbd16a5f1af7ae994910a8ea447e684edf5caebd6492b3a93c18933d\" returns successfully" May 14 23:41:59.548137 containerd[1574]: time="2025-05-14T23:41:59.547787372Z" level=info msg="connecting to shim 042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7" address="unix:///run/containerd/s/fb66ecd9b935eb6a98ac5ee414915b829f938e8ae7fa4da5ae92212aa32f4680" namespace=k8s.io protocol=ttrpc version=3 May 14 23:41:59.580489 systemd[1]: Started cri-containerd-042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7.scope - libcontainer container 042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7. May 14 23:41:59.594576 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:41:59.792852 containerd[1574]: time="2025-05-14T23:41:59.792772421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-549c9ddc47-k62bq,Uid:18a843ce-b18e-4959-8f6e-0f50bd293efe,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7\"" May 14 23:41:59.813965 containerd[1574]: time="2025-05-14T23:41:59.813940197Z" level=info msg="CreateContainer within sandbox \"042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 23:41:59.860418 containerd[1574]: time="2025-05-14T23:41:59.859787837Z" level=info msg="Container 90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e: CDI devices from CRI Config.CDIDevices: []" May 14 23:41:59.911849 containerd[1574]: time="2025-05-14T23:41:59.911826867Z" level=info msg="CreateContainer within sandbox \"042c33f10afbd5561cd91e63fb403853384ab3537af938b0383f4185e98790d7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e\"" May 14 23:41:59.912348 containerd[1574]: time="2025-05-14T23:41:59.912326574Z" level=info msg="StartContainer for \"90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e\"" May 14 23:41:59.913346 containerd[1574]: time="2025-05-14T23:41:59.913326118Z" level=info msg="connecting to shim 90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e" address="unix:///run/containerd/s/fb66ecd9b935eb6a98ac5ee414915b829f938e8ae7fa4da5ae92212aa32f4680" protocol=ttrpc version=3 May 14 23:41:59.929403 systemd[1]: Started cri-containerd-90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e.scope - libcontainer container 90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e. 
May 14 23:41:59.990419 containerd[1574]: time="2025-05-14T23:41:59.990390744Z" level=info msg="StartContainer for \"90ad3daceaca81e2d398b540c4a292cca71785bddf3e345f9e7c697ed1d4770e\" returns successfully" May 14 23:42:00.102985 kubelet[2831]: I0514 23:42:00.102867 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-549c9ddc47-qvrrc" podStartSLOduration=70.889670213 podStartE2EDuration="1m17.102850973s" podCreationTimestamp="2025-05-14 23:40:43 +0000 UTC" firstStartedPulling="2025-05-14 23:41:52.745142814 +0000 UTC m=+79.377944473" lastFinishedPulling="2025-05-14 23:41:58.958323573 +0000 UTC m=+85.591125233" observedRunningTime="2025-05-14 23:42:00.042060884 +0000 UTC m=+86.674862546" watchObservedRunningTime="2025-05-14 23:42:00.102850973 +0000 UTC m=+86.735652636" May 14 23:42:00.868868 kubelet[2831]: I0514 23:42:00.868781 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-549c9ddc47-k62bq" podStartSLOduration=77.868769594 podStartE2EDuration="1m17.868769594s" podCreationTimestamp="2025-05-14 23:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:42:00.103044358 +0000 UTC m=+86.735846026" watchObservedRunningTime="2025-05-14 23:42:00.868769594 +0000 UTC m=+87.501571262" May 14 23:42:01.011501 systemd-networkd[1463]: cali7380803dcbc: Gained IPv6LL May 14 23:42:01.570975 containerd[1574]: time="2025-05-14T23:42:01.570936845Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:01.582367 containerd[1574]: time="2025-05-14T23:42:01.582322275Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 14 23:42:01.602525 containerd[1574]: time="2025-05-14T23:42:01.602480731Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:01.620629 containerd[1574]: time="2025-05-14T23:42:01.620581138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:01.621154 containerd[1574]: time="2025-05-14T23:42:01.620865232Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.661413901s" May 14 23:42:01.621154 containerd[1574]: time="2025-05-14T23:42:01.620885960Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 14 23:42:01.622760 containerd[1574]: time="2025-05-14T23:42:01.622728857Z" level=info msg="CreateContainer within sandbox \"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 14 23:42:01.700975 
containerd[1574]: time="2025-05-14T23:42:01.700759921Z" level=info msg="Container 913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a: CDI devices from CRI Config.CDIDevices: []" May 14 23:42:01.844868 containerd[1574]: time="2025-05-14T23:42:01.844746502Z" level=info msg="CreateContainer within sandbox \"8d731bd710c6dcdfc7d3afec244787c85ebba5ef97e318352bd66c8a206ac4e8\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a\"" May 14 23:42:01.845950 containerd[1574]: time="2025-05-14T23:42:01.845576655Z" level=info msg="StartContainer for \"913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a\"" May 14 23:42:01.847148 containerd[1574]: time="2025-05-14T23:42:01.847127750Z" level=info msg="connecting to shim 913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a" address="unix:///run/containerd/s/cd15f684b501a34a83bf861c135e267a0a1a6135471f5b9db05b6cb284d00fb1" protocol=ttrpc version=3 May 14 23:42:01.862389 systemd[1]: Started cri-containerd-913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a.scope - libcontainer container 913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a. May 14 23:42:01.918709 containerd[1574]: time="2025-05-14T23:42:01.918636853Z" level=info msg="StartContainer for \"913e3b54222693c82dbd62d2efa637d2c3ea5ac6033b8772326e11d12891800a\" returns successfully" May 14 23:42:02.476409 systemd[1]: Started sshd@12-139.178.70.107:22-147.75.109.163:46526.service - OpenSSH per-connection server daemon (147.75.109.163:46526). May 14 23:42:02.511985 containerd[1574]: time="2025-05-14T23:42:02.511715964Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,}" May 14 23:42:02.511985 containerd[1574]: time="2025-05-14T23:42:02.511737087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,}" May 14 23:42:02.778665 sshd[5734]: Accepted publickey for core from 147.75.109.163 port 46526 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:02.793419 sshd-session[5734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:02.810542 systemd-logind[1552]: New session 15 of user core. May 14 23:42:02.814432 systemd[1]: Started session-15.scope - Session 15 of User core. 
May 14 23:42:02.973759 systemd-networkd[1463]: cali35c9286c8cf: Link UP May 14 23:42:02.974599 systemd-networkd[1463]: cali35c9286c8cf: Gained carrier May 14 23:42:03.020849 kubelet[2831]: I0514 23:42:03.020536 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9zp9w" podStartSLOduration=70.053692527 podStartE2EDuration="1m19.020521189s" podCreationTimestamp="2025-05-14 23:40:44 +0000 UTC" firstStartedPulling="2025-05-14 23:41:52.654462537 +0000 UTC m=+79.287264195" lastFinishedPulling="2025-05-14 23:42:01.621291199 +0000 UTC m=+88.254092857" observedRunningTime="2025-05-14 23:42:02.022927524 +0000 UTC m=+88.655729192" watchObservedRunningTime="2025-05-14 23:42:03.020521189 +0000 UTC m=+89.653322850" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.731 [INFO][5740] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0 coredns-6f6b679f8f- kube-system 5e86d3e4-1ed5-4ded-9be6-007462b12e77 671 0 2025-05-14 23:40:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-w9hfg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali35c9286c8cf [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.742 [INFO][5740] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.814 [INFO][5759] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" HandleID="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Workload="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.828 [INFO][5759] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" HandleID="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Workload="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000284540), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-w9hfg", "timestamp":"2025-05-14 23:42:02.81448325 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.828 [INFO][5759] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.828 [INFO][5759] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.828 [INFO][5759] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.848 [INFO][5759] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.863 [INFO][5759] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.865 [INFO][5759] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.866 [INFO][5759] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.867 [INFO][5759] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.867 [INFO][5759] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.868 [INFO][5759] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17 May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.883 [INFO][5759] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5759] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5759] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" host="localhost" May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5759] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:42:03.022233 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5759] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" HandleID="k8s-pod-network.8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Workload="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:02.916 [INFO][5740] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"5e86d3e4-1ed5-4ded-9be6-007462b12e77", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-w9hfg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali35c9286c8cf", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:02.968 [INFO][5740] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:02.969 [INFO][5740] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali35c9286c8cf ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:02.973 [INFO][5740] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:02.973 
[INFO][5740] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"5e86d3e4-1ed5-4ded-9be6-007462b12e77", ResourceVersion:"671", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17", Pod:"coredns-6f6b679f8f-w9hfg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali35c9286c8cf", MAC:"ea:83:0d:9a:28:ee", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:42:03.047282 containerd[1574]: 2025-05-14 23:42:03.019 [INFO][5740] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" Namespace="kube-system" Pod="coredns-6f6b679f8f-w9hfg" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--w9hfg-eth0" May 14 23:42:03.032526 systemd-networkd[1463]: calidadb0f18492: Link UP May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.731 [INFO][5735] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0 calico-kube-controllers-7cd45688fd- calico-system 26bdfeca-0687-46a9-90a0-450c95fd195f 672 0 2025-05-14 23:40:44 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7cd45688fd projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7cd45688fd-h8dqc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidadb0f18492 [] []}} ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-" 
May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.742 [INFO][5735] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.829 [INFO][5761] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" HandleID="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Workload="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.848 [INFO][5761] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" HandleID="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Workload="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000278800), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7cd45688fd-h8dqc", "timestamp":"2025-05-14 23:42:02.829783835 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.848 [INFO][5761] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5761] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.914 [INFO][5761] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.946 [INFO][5761] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.950 [INFO][5761] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.966 [INFO][5761] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.967 [INFO][5761] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.970 [INFO][5761] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.970 [INFO][5761] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.972 [INFO][5761] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04 May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:02.987 [INFO][5761] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:03.028 [INFO][5761] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:03.028 [INFO][5761] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" host="localhost" May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:03.028 [INFO][5761] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 23:42:03.100521 containerd[1574]: 2025-05-14 23:42:03.028 [INFO][5761] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" HandleID="k8s-pod-network.9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Workload="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.032701 systemd-networkd[1463]: calidadb0f18492: Gained carrier May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.030 [INFO][5735] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0", GenerateName:"calico-kube-controllers-7cd45688fd-", Namespace:"calico-system", SelfLink:"", UID:"26bdfeca-0687-46a9-90a0-450c95fd195f", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd45688fd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7cd45688fd-h8dqc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidadb0f18492", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.030 [INFO][5735] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.030 [INFO][5735] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidadb0f18492 ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.032 [INFO][5735] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.032 
[INFO][5735] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0", GenerateName:"calico-kube-controllers-7cd45688fd-", Namespace:"calico-system", SelfLink:"", UID:"26bdfeca-0687-46a9-90a0-450c95fd195f", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 23, 40, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7cd45688fd", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04", Pod:"calico-kube-controllers-7cd45688fd-h8dqc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidadb0f18492", MAC:"9a:86:b0:68:5e:4c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 23:42:03.111492 containerd[1574]: 2025-05-14 23:42:03.073 [INFO][5735] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" Namespace="calico-system" Pod="calico-kube-controllers-7cd45688fd-h8dqc" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7cd45688fd--h8dqc-eth0" May 14 23:42:03.322232 containerd[1574]: time="2025-05-14T23:42:03.322138766Z" level=info msg="connecting to shim 8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17" address="unix:///run/containerd/s/aaf9e127c3f44ab7bd6fc4ddadb94621137a6fd8fb183848bef71ae675474f24" namespace=k8s.io protocol=ttrpc version=3 May 14 23:42:03.339833 containerd[1574]: time="2025-05-14T23:42:03.339778163Z" level=info msg="connecting to shim 9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04" address="unix:///run/containerd/s/f41d1a3913af10c0ba6a4c25061ac9d6a4aa64817c3127dc8f482a4b94f42c0e" namespace=k8s.io protocol=ttrpc version=3 May 14 23:42:03.342427 systemd[1]: Started cri-containerd-8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17.scope - libcontainer container 8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17. May 14 23:42:03.356590 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:42:03.366452 systemd[1]: Started cri-containerd-9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04.scope - libcontainer container 9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04. 
May 14 23:42:03.386054 systemd-resolved[1464]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 14 23:42:03.397154 containerd[1574]: time="2025-05-14T23:42:03.396693613Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-w9hfg,Uid:5e86d3e4-1ed5-4ded-9be6-007462b12e77,Namespace:kube-system,Attempt:0,} returns sandbox id \"8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17\"" May 14 23:42:03.418536 containerd[1574]: time="2025-05-14T23:42:03.416783441Z" level=info msg="CreateContainer within sandbox \"8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 23:42:03.430251 containerd[1574]: time="2025-05-14T23:42:03.430029564Z" level=info msg="Container 751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431: CDI devices from CRI Config.CDIDevices: []" May 14 23:42:03.431473 containerd[1574]: time="2025-05-14T23:42:03.430608410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7cd45688fd-h8dqc,Uid:26bdfeca-0687-46a9-90a0-450c95fd195f,Namespace:calico-system,Attempt:0,} returns sandbox id \"9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04\"" May 14 23:42:03.435928 containerd[1574]: time="2025-05-14T23:42:03.435826628Z" level=info msg="CreateContainer within sandbox \"8d36449a833c7f19c81417363fd979cc573f2281b89a371c585dc418adf2fe17\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431\"" May 14 23:42:03.440465 containerd[1574]: time="2025-05-14T23:42:03.440001543Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 14 23:42:03.440465 containerd[1574]: time="2025-05-14T23:42:03.440322617Z" level=info msg="StartContainer for \"751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431\"" May 14 23:42:03.441630 containerd[1574]: time="2025-05-14T23:42:03.441609530Z" level=info msg="connecting to shim 751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431" address="unix:///run/containerd/s/aaf9e127c3f44ab7bd6fc4ddadb94621137a6fd8fb183848bef71ae675474f24" protocol=ttrpc version=3 May 14 23:42:03.461787 systemd[1]: Started cri-containerd-751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431.scope - libcontainer container 751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431. May 14 23:42:03.498179 containerd[1574]: time="2025-05-14T23:42:03.498154011Z" level=info msg="StartContainer for \"751b885465466752344eb0a9292d999cc22f5fa0e6c994cda78abaabfaff1431\" returns successfully" May 14 23:42:03.759917 kubelet[2831]: I0514 23:42:03.758534 2831 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 14 23:42:03.769775 kubelet[2831]: I0514 23:42:03.769546 2831 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 14 23:42:03.918153 sshd[5771]: Connection closed by 147.75.109.163 port 46526 May 14 23:42:03.918630 sshd-session[5734]: pam_unix(sshd:session): session closed for user core May 14 23:42:03.920713 systemd-logind[1552]: Session 15 logged out. Waiting for processes to exit. May 14 23:42:03.921055 systemd[1]: sshd@12-139.178.70.107:22-147.75.109.163:46526.service: Deactivated successfully. 
May 14 23:42:03.922575 systemd[1]: session-15.scope: Deactivated successfully. May 14 23:42:03.923721 systemd-logind[1552]: Removed session 15. May 14 23:42:04.037114 kubelet[2831]: I0514 23:42:04.037026 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-w9hfg" podStartSLOduration=87.037012603 podStartE2EDuration="1m27.037012603s" podCreationTimestamp="2025-05-14 23:40:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 23:42:04.018136564 +0000 UTC m=+90.650938231" watchObservedRunningTime="2025-05-14 23:42:04.037012603 +0000 UTC m=+90.669814273" May 14 23:42:04.083396 systemd-networkd[1463]: calidadb0f18492: Gained IPv6LL May 14 23:42:04.595410 systemd-networkd[1463]: cali35c9286c8cf: Gained IPv6LL May 14 23:42:05.795205 containerd[1574]: time="2025-05-14T23:42:05.795170757Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:05.807658 containerd[1574]: time="2025-05-14T23:42:05.807481924Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 14 23:42:05.811960 containerd[1574]: time="2025-05-14T23:42:05.811813214Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:05.813135 containerd[1574]: time="2025-05-14T23:42:05.813108162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 23:42:05.815733 containerd[1574]: time="2025-05-14T23:42:05.813414420Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 2.373387091s" May 14 23:42:05.815733 containerd[1574]: time="2025-05-14T23:42:05.813441200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 14 23:42:05.900984 containerd[1574]: time="2025-05-14T23:42:05.900945373Z" level=info msg="CreateContainer within sandbox \"9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 14 23:42:05.908068 containerd[1574]: time="2025-05-14T23:42:05.908043685Z" level=info msg="Container a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35: CDI devices from CRI Config.CDIDevices: []" May 14 23:42:05.913418 containerd[1574]: time="2025-05-14T23:42:05.913397024Z" level=info msg="CreateContainer within sandbox \"9c98c02adf6e0620d569a618b2790015a0b8bf5d665281aa03751c0742b23f04\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\"" May 14 23:42:05.913802 containerd[1574]: time="2025-05-14T23:42:05.913673046Z" level=info msg="StartContainer for 
\"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\"" May 14 23:42:05.914240 containerd[1574]: time="2025-05-14T23:42:05.914222207Z" level=info msg="connecting to shim a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35" address="unix:///run/containerd/s/f41d1a3913af10c0ba6a4c25061ac9d6a4aa64817c3127dc8f482a4b94f42c0e" protocol=ttrpc version=3 May 14 23:42:05.930412 systemd[1]: Started cri-containerd-a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35.scope - libcontainer container a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35. May 14 23:42:05.991026 containerd[1574]: time="2025-05-14T23:42:05.990958016Z" level=info msg="StartContainer for \"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\" returns successfully" May 14 23:42:06.518729 containerd[1574]: time="2025-05-14T23:42:06.518694785Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\" id:\"ec071ccf4e644e36c0c5d6e4b3f153d57292a59385f11198125f323d19f9b6c5\" pid:5992 exited_at:{seconds:1747266126 nanos:473385278}" May 14 23:42:06.586692 kubelet[2831]: I0514 23:42:06.586409 2831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7cd45688fd-h8dqc" podStartSLOduration=80.177244617 podStartE2EDuration="1m22.586394528s" podCreationTimestamp="2025-05-14 23:40:44 +0000 UTC" firstStartedPulling="2025-05-14 23:42:03.43503969 +0000 UTC m=+90.067841349" lastFinishedPulling="2025-05-14 23:42:05.844189601 +0000 UTC m=+92.476991260" observedRunningTime="2025-05-14 23:42:06.334919388 +0000 UTC m=+92.967721056" watchObservedRunningTime="2025-05-14 23:42:06.586394528 +0000 UTC m=+93.219196191" May 14 23:42:08.941563 systemd[1]: Started sshd@13-139.178.70.107:22-147.75.109.163:58566.service - OpenSSH per-connection server daemon (147.75.109.163:58566). May 14 23:42:09.393425 sshd[6008]: Accepted publickey for core from 147.75.109.163 port 58566 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:09.394636 sshd-session[6008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:09.399358 systemd-logind[1552]: New session 16 of user core. May 14 23:42:09.404449 systemd[1]: Started session-16.scope - Session 16 of User core. May 14 23:42:10.271194 sshd[6010]: Connection closed by 147.75.109.163 port 58566 May 14 23:42:10.271682 sshd-session[6008]: pam_unix(sshd:session): session closed for user core May 14 23:42:10.275074 systemd[1]: sshd@13-139.178.70.107:22-147.75.109.163:58566.service: Deactivated successfully. May 14 23:42:10.277135 systemd[1]: session-16.scope: Deactivated successfully. May 14 23:42:10.278406 systemd-logind[1552]: Session 16 logged out. Waiting for processes to exit. May 14 23:42:10.279551 systemd-logind[1552]: Removed session 16. May 14 23:42:15.281852 systemd[1]: Started sshd@14-139.178.70.107:22-147.75.109.163:58580.service - OpenSSH per-connection server daemon (147.75.109.163:58580). May 14 23:42:15.413989 sshd[6047]: Accepted publickey for core from 147.75.109.163 port 58580 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:15.414933 sshd-session[6047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:15.418242 systemd-logind[1552]: New session 17 of user core. May 14 23:42:15.422401 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 14 23:42:15.722591 sshd[6053]: Connection closed by 147.75.109.163 port 58580 May 14 23:42:15.724667 sshd-session[6047]: pam_unix(sshd:session): session closed for user core May 14 23:42:15.734606 systemd[1]: sshd@14-139.178.70.107:22-147.75.109.163:58580.service: Deactivated successfully. May 14 23:42:15.737447 systemd[1]: session-17.scope: Deactivated successfully. May 14 23:42:15.739632 systemd-logind[1552]: Session 17 logged out. Waiting for processes to exit. May 14 23:42:15.743717 systemd-logind[1552]: Removed session 17. May 14 23:42:15.754931 containerd[1574]: time="2025-05-14T23:42:15.754899021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b47c4202d924f31c85218be99e5dc9d40484703e8558160bf454f036b59a2736\" id:\"ed10e0d55e1d1bde005c646ea9830caa3949758417dd48f18d46a305859d00ad\" pid:6040 exited_at:{seconds:1747266135 nanos:754303977}" May 14 23:42:20.731692 systemd[1]: Started sshd@15-139.178.70.107:22-147.75.109.163:50820.service - OpenSSH per-connection server daemon (147.75.109.163:50820). May 14 23:42:20.802474 sshd[6068]: Accepted publickey for core from 147.75.109.163 port 50820 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:20.803593 sshd-session[6068]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:20.808483 systemd-logind[1552]: New session 18 of user core. May 14 23:42:20.814458 systemd[1]: Started session-18.scope - Session 18 of User core. May 14 23:42:21.091588 sshd[6070]: Connection closed by 147.75.109.163 port 50820 May 14 23:42:21.102489 systemd[1]: Started sshd@16-139.178.70.107:22-147.75.109.163:50824.service - OpenSSH per-connection server daemon (147.75.109.163:50824). May 14 23:42:21.092506 sshd-session[6068]: pam_unix(sshd:session): session closed for user core May 14 23:42:21.102744 systemd[1]: sshd@15-139.178.70.107:22-147.75.109.163:50820.service: Deactivated successfully. May 14 23:42:21.104127 systemd[1]: session-18.scope: Deactivated successfully. May 14 23:42:21.105627 systemd-logind[1552]: Session 18 logged out. Waiting for processes to exit. May 14 23:42:21.109270 systemd-logind[1552]: Removed session 18. May 14 23:42:21.136314 sshd[6079]: Accepted publickey for core from 147.75.109.163 port 50824 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:21.137114 sshd-session[6079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:21.140051 systemd-logind[1552]: New session 19 of user core. May 14 23:42:21.146456 systemd[1]: Started session-19.scope - Session 19 of User core. May 14 23:42:21.544702 sshd[6084]: Connection closed by 147.75.109.163 port 50824 May 14 23:42:21.546287 sshd-session[6079]: pam_unix(sshd:session): session closed for user core May 14 23:42:21.553322 systemd[1]: sshd@16-139.178.70.107:22-147.75.109.163:50824.service: Deactivated successfully. May 14 23:42:21.554799 systemd[1]: session-19.scope: Deactivated successfully. May 14 23:42:21.555927 systemd-logind[1552]: Session 19 logged out. Waiting for processes to exit. May 14 23:42:21.556739 systemd-logind[1552]: Removed session 19. May 14 23:42:21.559623 systemd[1]: Started sshd@17-139.178.70.107:22-147.75.109.163:50832.service - OpenSSH per-connection server daemon (147.75.109.163:50832). 
May 14 23:42:21.655501 sshd[6094]: Accepted publickey for core from 147.75.109.163 port 50832 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:21.656362 sshd-session[6094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:21.661466 systemd-logind[1552]: New session 20 of user core. May 14 23:42:21.671427 systemd[1]: Started session-20.scope - Session 20 of User core. May 14 23:42:23.576481 sshd[6096]: Connection closed by 147.75.109.163 port 50832 May 14 23:42:23.578704 sshd-session[6094]: pam_unix(sshd:session): session closed for user core May 14 23:42:23.588287 systemd[1]: Started sshd@18-139.178.70.107:22-147.75.109.163:50838.service - OpenSSH per-connection server daemon (147.75.109.163:50838). May 14 23:42:23.589928 systemd[1]: sshd@17-139.178.70.107:22-147.75.109.163:50832.service: Deactivated successfully. May 14 23:42:23.591544 systemd[1]: session-20.scope: Deactivated successfully. May 14 23:42:23.591668 systemd[1]: session-20.scope: Consumed 412ms CPU time, 69.2M memory peak. May 14 23:42:23.595196 systemd-logind[1552]: Session 20 logged out. Waiting for processes to exit. May 14 23:42:23.596296 systemd-logind[1552]: Removed session 20. May 14 23:42:23.667209 sshd[6120]: Accepted publickey for core from 147.75.109.163 port 50838 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:23.670488 sshd-session[6120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:23.678860 systemd-logind[1552]: New session 21 of user core. May 14 23:42:23.684618 systemd[1]: Started session-21.scope - Session 21 of User core. May 14 23:42:25.368676 sshd[6128]: Connection closed by 147.75.109.163 port 50838 May 14 23:42:25.377756 systemd[1]: sshd@18-139.178.70.107:22-147.75.109.163:50838.service: Deactivated successfully. May 14 23:42:25.368949 sshd-session[6120]: pam_unix(sshd:session): session closed for user core May 14 23:42:25.379099 systemd[1]: session-21.scope: Deactivated successfully. May 14 23:42:25.380207 systemd-logind[1552]: Session 21 logged out. Waiting for processes to exit. May 14 23:42:25.381250 systemd[1]: Started sshd@19-139.178.70.107:22-147.75.109.163:50854.service - OpenSSH per-connection server daemon (147.75.109.163:50854). May 14 23:42:25.382238 systemd-logind[1552]: Removed session 21. May 14 23:42:25.838793 sshd[6137]: Accepted publickey for core from 147.75.109.163 port 50854 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:25.839206 sshd-session[6137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:25.844389 systemd-logind[1552]: New session 22 of user core. May 14 23:42:25.847396 systemd[1]: Started session-22.scope - Session 22 of User core. May 14 23:42:26.009504 sshd[6140]: Connection closed by 147.75.109.163 port 50854 May 14 23:42:26.010863 sshd-session[6137]: pam_unix(sshd:session): session closed for user core May 14 23:42:26.014288 systemd[1]: sshd@19-139.178.70.107:22-147.75.109.163:50854.service: Deactivated successfully. May 14 23:42:26.016764 systemd[1]: session-22.scope: Deactivated successfully. May 14 23:42:26.018641 systemd-logind[1552]: Session 22 logged out. Waiting for processes to exit. May 14 23:42:26.019381 systemd-logind[1552]: Removed session 22. May 14 23:42:31.021604 systemd[1]: Started sshd@20-139.178.70.107:22-147.75.109.163:46358.service - OpenSSH per-connection server daemon (147.75.109.163:46358). 
May 14 23:42:31.135682 sshd[6155]: Accepted publickey for core from 147.75.109.163 port 46358 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:31.137024 sshd-session[6155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:31.140782 systemd-logind[1552]: New session 23 of user core. May 14 23:42:31.145476 systemd[1]: Started session-23.scope - Session 23 of User core. May 14 23:42:31.685197 sshd[6162]: Connection closed by 147.75.109.163 port 46358 May 14 23:42:31.686973 sshd-session[6155]: pam_unix(sshd:session): session closed for user core May 14 23:42:31.689863 systemd-logind[1552]: Session 23 logged out. Waiting for processes to exit. May 14 23:42:31.690351 systemd[1]: sshd@20-139.178.70.107:22-147.75.109.163:46358.service: Deactivated successfully. May 14 23:42:31.691954 systemd[1]: session-23.scope: Deactivated successfully. May 14 23:42:31.693158 systemd-logind[1552]: Removed session 23. May 14 23:42:33.699251 containerd[1574]: time="2025-05-14T23:42:33.699215244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\" id:\"63d309405401e0d0d5df35d822d4e9e9377c38516a6b50aa9e2cda51e64155ee\" pid:6186 exited_at:{seconds:1747266153 nanos:697132592}" May 14 23:42:33.860226 containerd[1574]: time="2025-05-14T23:42:33.860089376Z" level=info msg="StopPodSandbox for \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\"" May 14 23:42:33.893583 containerd[1574]: time="2025-05-14T23:42:33.893533483Z" level=info msg="TearDown network for sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" successfully" May 14 23:42:33.893583 containerd[1574]: time="2025-05-14T23:42:33.893576830Z" level=info msg="StopPodSandbox for \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" returns successfully" May 14 23:42:34.066988 containerd[1574]: time="2025-05-14T23:42:34.066955426Z" level=info msg="RemovePodSandbox for \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\"" May 14 23:42:34.070239 containerd[1574]: time="2025-05-14T23:42:34.070219941Z" level=info msg="Forcibly stopping sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\"" May 14 23:42:34.070342 containerd[1574]: time="2025-05-14T23:42:34.070328107Z" level=info msg="TearDown network for sandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" successfully" May 14 23:42:34.086291 containerd[1574]: time="2025-05-14T23:42:34.086130239Z" level=info msg="Ensure that sandbox 6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e in task-service has been cleanup successfully" May 14 23:42:34.098131 containerd[1574]: time="2025-05-14T23:42:34.097957803Z" level=info msg="RemovePodSandbox \"6c142dbf7c0a227172e5804f0eaa0d4995c74d8574d6a54456f020b33dbea57e\" returns successfully" May 14 23:42:35.778763 containerd[1574]: time="2025-05-14T23:42:35.778737039Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a1f649132aa7c33f7cbea7c72cc0f907828a0264eb832033fc520643314f7a35\" id:\"760b756e1bcee99995c90156155118e0c23dc7d0213ca6f734836f5cc69d47f3\" pid:6211 exited_at:{seconds:1747266155 nanos:778191255}" May 14 23:42:36.696436 systemd[1]: Started sshd@21-139.178.70.107:22-147.75.109.163:46372.service - OpenSSH per-connection server daemon (147.75.109.163:46372). 
May 14 23:42:36.743632 sshd[6221]: Accepted publickey for core from 147.75.109.163 port 46372 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:36.745006 sshd-session[6221]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:36.748006 systemd-logind[1552]: New session 24 of user core. May 14 23:42:36.762447 systemd[1]: Started session-24.scope - Session 24 of User core. May 14 23:42:37.321606 sshd[6223]: Connection closed by 147.75.109.163 port 46372 May 14 23:42:37.322016 sshd-session[6221]: pam_unix(sshd:session): session closed for user core May 14 23:42:37.323733 systemd[1]: sshd@21-139.178.70.107:22-147.75.109.163:46372.service: Deactivated successfully. May 14 23:42:37.325031 systemd[1]: session-24.scope: Deactivated successfully. May 14 23:42:37.325898 systemd-logind[1552]: Session 24 logged out. Waiting for processes to exit. May 14 23:42:37.326513 systemd-logind[1552]: Removed session 24. May 14 23:42:42.332710 systemd[1]: Started sshd@22-139.178.70.107:22-147.75.109.163:33208.service - OpenSSH per-connection server daemon (147.75.109.163:33208). May 14 23:42:42.391098 sshd[6240]: Accepted publickey for core from 147.75.109.163 port 33208 ssh2: RSA SHA256:dI/YmSL2eaJjRcDCjQQVyl715CqaDpo2Zw+lF5CHaNQ May 14 23:42:42.392019 sshd-session[6240]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 23:42:42.395982 systemd-logind[1552]: New session 25 of user core. May 14 23:42:42.399450 systemd[1]: Started session-25.scope - Session 25 of User core. May 14 23:42:42.634932 sshd[6242]: Connection closed by 147.75.109.163 port 33208 May 14 23:42:42.635706 sshd-session[6240]: pam_unix(sshd:session): session closed for user core May 14 23:42:42.637809 systemd[1]: sshd@22-139.178.70.107:22-147.75.109.163:33208.service: Deactivated successfully. May 14 23:42:42.638887 systemd[1]: session-25.scope: Deactivated successfully. May 14 23:42:42.640051 systemd-logind[1552]: Session 25 logged out. Waiting for processes to exit. May 14 23:42:42.640964 systemd-logind[1552]: Removed session 25.