Feb 13 19:19:43.736660 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 17:40:15 -00 2025
Feb 13 19:19:43.736675 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416
Feb 13 19:19:43.736681 kernel: Disabled fast string operations
Feb 13 19:19:43.736685 kernel: BIOS-provided physical RAM map:
Feb 13 19:19:43.736689 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Feb 13 19:19:43.736693 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Feb 13 19:19:43.736699 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Feb 13 19:19:43.736709 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Feb 13 19:19:43.736713 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Feb 13 19:19:43.736717 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Feb 13 19:19:43.736721 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Feb 13 19:19:43.736725 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Feb 13 19:19:43.736729 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Feb 13 19:19:43.736733 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 19:19:43.736740 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Feb 13 19:19:43.736745 kernel: NX (Execute Disable) protection: active
Feb 13 19:19:43.736749 kernel: APIC: Static calls initialized
Feb 13 19:19:43.736754 kernel: SMBIOS 2.7 present.
Feb 13 19:19:43.736758 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Feb 13 19:19:43.736763 kernel: vmware: hypercall mode: 0x00
Feb 13 19:19:43.736767 kernel: Hypervisor detected: VMware
Feb 13 19:19:43.736772 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Feb 13 19:19:43.736777 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Feb 13 19:19:43.736782 kernel: vmware: using clock offset of 2366117433 ns
Feb 13 19:19:43.736786 kernel: tsc: Detected 3408.000 MHz processor
Feb 13 19:19:43.736791 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 19:19:43.736796 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 19:19:43.736801 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Feb 13 19:19:43.736805 kernel: total RAM covered: 3072M
Feb 13 19:19:43.736810 kernel: Found optimal setting for mtrr clean up
Feb 13 19:19:43.736817 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Feb 13 19:19:43.736822 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Feb 13 19:19:43.736827 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 19:19:43.736832 kernel: Using GB pages for direct mapping
Feb 13 19:19:43.736836 kernel: ACPI: Early table checksum verification disabled
Feb 13 19:19:43.736841 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Feb 13 19:19:43.736846 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Feb 13 19:19:43.736850 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Feb 13 19:19:43.736856 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Feb 13 19:19:43.736860 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 19:19:43.736867 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 19:19:43.736872 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Feb 13 19:19:43.736877 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Feb 13 19:19:43.736882 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Feb 13 19:19:43.736887 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Feb 13 19:19:43.736892 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Feb 13 19:19:43.736898 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Feb 13 19:19:43.736903 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Feb 13 19:19:43.736907 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Feb 13 19:19:43.736912 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 19:19:43.736917 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 19:19:43.736922 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Feb 13 19:19:43.736927 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Feb 13 19:19:43.736932 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Feb 13 19:19:43.736936 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Feb 13 19:19:43.736942 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Feb 13 19:19:43.736947 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Feb 13 19:19:43.736952 kernel: system APIC only can use physical flat
Feb 13 19:19:43.736956 kernel: APIC: Switched APIC routing to: physical flat
Feb 13 19:19:43.736961 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 19:19:43.736966 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Feb 13 19:19:43.736971 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Feb 13 19:19:43.736975 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Feb 13 19:19:43.736980 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Feb 13 19:19:43.736985 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Feb 13 19:19:43.736990 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Feb 13 19:19:43.736995 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Feb 13 19:19:43.737000 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Feb 13 19:19:43.737005 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Feb 13 19:19:43.737009 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Feb 13 19:19:43.737014 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Feb 13 19:19:43.737018 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Feb 13 19:19:43.737023 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Feb 13 19:19:43.737028 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Feb 13 19:19:43.737032 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Feb 13 19:19:43.737038 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Feb 13 19:19:43.737043 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Feb 13 19:19:43.737048 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Feb 13 19:19:43.737052 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Feb 13 19:19:43.737057 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Feb 13 19:19:43.737062 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Feb 13 19:19:43.737066 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Feb 13 19:19:43.737071 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Feb 13 19:19:43.737076 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Feb 13 19:19:43.737080 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Feb 13 19:19:43.737086 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Feb 13 19:19:43.737091 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Feb 13 19:19:43.737095 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Feb 13 19:19:43.737100 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Feb 13 19:19:43.737105 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Feb 13 19:19:43.737110 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Feb 13 19:19:43.737114 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Feb 13 19:19:43.737119 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Feb 13 19:19:43.737124 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Feb 13 19:19:43.737128 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Feb 13 19:19:43.737134 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Feb 13 19:19:43.737139 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Feb 13 19:19:43.737143 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Feb 13 19:19:43.737148 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Feb 13 19:19:43.737153 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Feb 13 19:19:43.737157 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Feb 13 19:19:43.737162 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Feb 13 19:19:43.737167 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Feb 13 19:19:43.737171 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Feb 13 19:19:43.737176 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Feb 13 19:19:43.737181 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Feb 13 19:19:43.737186 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Feb 13 19:19:43.737191 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Feb 13 19:19:43.737195 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Feb 13 19:19:43.737200 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Feb 13 19:19:43.737205 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Feb 13 19:19:43.737209 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Feb 13 19:19:43.737214 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Feb 13 19:19:43.737219 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Feb 13 19:19:43.737223 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Feb 13 19:19:43.737228 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Feb 13 19:19:43.737233 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Feb 13 19:19:43.737238 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Feb 13 19:19:43.737246 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Feb 13 19:19:43.737252 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Feb 13 19:19:43.737257 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Feb 13 19:19:43.737262 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Feb 13 19:19:43.737267 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Feb 13 19:19:43.737272 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Feb 13 19:19:43.737278 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Feb 13 19:19:43.737283 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Feb 13 19:19:43.737288 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Feb 13 19:19:43.737293 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Feb 13 19:19:43.737298 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Feb 13 19:19:43.737303 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Feb 13 19:19:43.737307 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Feb 13 19:19:43.737312 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Feb 13 19:19:43.737317 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Feb 13 19:19:43.737322 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Feb 13 19:19:43.737328 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Feb 13 19:19:43.737333 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Feb 13 19:19:43.737338 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Feb 13 19:19:43.737343 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Feb 13 19:19:43.737348 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Feb 13 19:19:43.737353 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Feb 13 19:19:43.737358 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Feb 13 19:19:43.737363 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Feb 13 19:19:43.737368 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Feb 13 19:19:43.737373 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Feb 13 19:19:43.737378 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Feb 13 19:19:43.737384 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Feb 13 19:19:43.737389 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Feb 13 19:19:43.737394 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Feb 13 19:19:43.737399 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Feb 13 19:19:43.737404 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Feb 13 19:19:43.737409 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Feb 13 19:19:43.737413 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Feb 13 19:19:43.737418 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Feb 13 19:19:43.737423 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Feb 13 19:19:43.737428 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Feb 13 19:19:43.737434 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Feb 13 19:19:43.737439 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Feb 13 19:19:43.737444 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Feb 13 19:19:43.737449 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Feb 13 19:19:43.737454 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Feb 13 19:19:43.737459 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Feb 13 19:19:43.737464 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Feb 13 19:19:43.737469 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Feb 13 19:19:43.737474 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Feb 13 19:19:43.737479 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Feb 13 19:19:43.737485 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Feb 13 19:19:43.737490 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Feb 13 19:19:43.737495 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Feb 13 19:19:43.737500 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Feb 13 19:19:43.737505 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Feb 13 19:19:43.737510 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Feb 13 19:19:43.737515 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Feb 13 19:19:43.737520 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Feb 13 19:19:43.737525 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Feb 13 19:19:43.737530 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Feb 13 19:19:43.737536 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Feb 13 19:19:43.737541 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Feb 13 19:19:43.737545 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Feb 13 19:19:43.737550 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Feb 13 19:19:43.737555 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Feb 13 19:19:43.737560 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Feb 13 19:19:43.737565 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Feb 13 19:19:43.737570 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Feb 13 19:19:43.737575 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Feb 13 19:19:43.737580 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Feb 13 19:19:43.737585 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Feb 13 19:19:43.737591 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Feb 13 19:19:43.737596 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 19:19:43.737601 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 19:19:43.737606 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Feb 13 19:19:43.737611 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Feb 13 19:19:43.737616 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Feb 13 19:19:43.737622 kernel: Zone ranges:
Feb 13 19:19:43.737627 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 19:19:43.737632 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Feb 13 19:19:43.737638 kernel: Normal empty
Feb 13 19:19:43.737643 kernel: Movable zone start for each node
Feb 13 19:19:43.737648 kernel: Early memory node ranges
Feb 13 19:19:43.737653 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Feb 13 19:19:43.737658 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Feb 13 19:19:43.737663 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Feb 13 19:19:43.737668 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Feb 13 19:19:43.737673 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 19:19:43.737679 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Feb 13 19:19:43.737685 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Feb 13 19:19:43.737690 kernel: ACPI: PM-Timer IO Port: 0x1008
Feb 13 19:19:43.737695 kernel: system APIC only can use physical flat
Feb 13 19:19:43.737700 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Feb 13 19:19:43.737999 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 19:19:43.738005 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 19:19:43.738010 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 19:19:43.738015 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 19:19:43.738021 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 19:19:43.738026 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 19:19:43.738033 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 19:19:43.738038 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 19:19:43.738043 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 19:19:43.738048 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 19:19:43.738053 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 19:19:43.738058 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 19:19:43.738063 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 19:19:43.738068 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 19:19:43.738073 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 19:19:43.738078 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 19:19:43.738084 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Feb 13 19:19:43.738089 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Feb 13 19:19:43.738094 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Feb 13 19:19:43.738100 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Feb 13 19:19:43.738105 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Feb 13 19:19:43.738110 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Feb 13 19:19:43.738115 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Feb 13 19:19:43.738120 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Feb 13 19:19:43.738125 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Feb 13 19:19:43.738130 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Feb 13 19:19:43.738136 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Feb 13 19:19:43.738141 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Feb 13 19:19:43.738146 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Feb 13 19:19:43.738151 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Feb 13 19:19:43.738156 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Feb 13 19:19:43.738161 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Feb 13 19:19:43.738166 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Feb 13 19:19:43.738171 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Feb 13 19:19:43.738176 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Feb 13 19:19:43.738182 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Feb 13 19:19:43.738187 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Feb 13 19:19:43.738193 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Feb 13 19:19:43.738197 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Feb 13 19:19:43.738203 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Feb 13 19:19:43.738207 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Feb 13 19:19:43.738212 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Feb 13 19:19:43.738217 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Feb 13 19:19:43.738222 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Feb 13 19:19:43.738228 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Feb 13 19:19:43.738234 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Feb 13 19:19:43.738239 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Feb 13 19:19:43.738244 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Feb 13 19:19:43.738249 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Feb 13 19:19:43.738254 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Feb 13 19:19:43.738259 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Feb 13 19:19:43.738264 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Feb 13 19:19:43.738269 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Feb 13 19:19:43.738274 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Feb 13 19:19:43.738279 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Feb 13 19:19:43.738285 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Feb 13 19:19:43.738290 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Feb 13 19:19:43.738295 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Feb 13 19:19:43.738300 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Feb 13 19:19:43.738305 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Feb 13 19:19:43.738310 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Feb 13 19:19:43.738315 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Feb 13 19:19:43.738320 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Feb 13 19:19:43.738325 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Feb 13 19:19:43.738331 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Feb 13 19:19:43.738336 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Feb 13 19:19:43.738341 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Feb 13 19:19:43.738346 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Feb 13 19:19:43.738351 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Feb 13 19:19:43.738356 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Feb 13 19:19:43.738361 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Feb 13 19:19:43.738366 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Feb 13 19:19:43.738371 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Feb 13 19:19:43.738377 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Feb 13 19:19:43.738382 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Feb 13 19:19:43.738388 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Feb 13 19:19:43.738392 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Feb 13 19:19:43.738398 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Feb 13 19:19:43.738402 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Feb 13 19:19:43.738412 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Feb 13 19:19:43.738418 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Feb 13 19:19:43.738423 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Feb 13 19:19:43.738428 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Feb 13 19:19:43.738433 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Feb 13 19:19:43.738440 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Feb 13 19:19:43.738445 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Feb 13 19:19:43.738450 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Feb 13 19:19:43.738455 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Feb 13 19:19:43.738460 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Feb 13 19:19:43.738465 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Feb 13 19:19:43.738470 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Feb 13 19:19:43.738475 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Feb 13 19:19:43.738480 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Feb 13 19:19:43.738486 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Feb 13 19:19:43.738491 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Feb 13 19:19:43.738496 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Feb 13 19:19:43.738501 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Feb 13 19:19:43.738506 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Feb 13 19:19:43.738511 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Feb 13 19:19:43.738516 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Feb 13 19:19:43.738521 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Feb 13 19:19:43.738526 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Feb 13 19:19:43.738531 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Feb 13 19:19:43.738537 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Feb 13 19:19:43.738542 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Feb 13 19:19:43.738547 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Feb 13 19:19:43.738552 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Feb 13 19:19:43.738557 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Feb 13 19:19:43.738562 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Feb 13 19:19:43.738567 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Feb 13 19:19:43.738572 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Feb 13 19:19:43.738577 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Feb 13 19:19:43.738582 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Feb 13 19:19:43.738588 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Feb 13 19:19:43.738594 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Feb 13 19:19:43.738598 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Feb 13 19:19:43.738604 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Feb 13 19:19:43.738609 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Feb 13 19:19:43.738614 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Feb 13 19:19:43.738619 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Feb 13 19:19:43.738624 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Feb 13 19:19:43.738629 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Feb 13 19:19:43.738634 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Feb 13 19:19:43.738640 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Feb 13 19:19:43.738645 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Feb 13 19:19:43.738650 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Feb 13 19:19:43.738655 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Feb 13 19:19:43.738660 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Feb 13 19:19:43.738665 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Feb 13 19:19:43.738671 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 19:19:43.738676 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Feb 13 19:19:43.738681 kernel: TSC deadline timer available
Feb 13 19:19:43.738687 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Feb 13 19:19:43.738692 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Feb 13 19:19:43.738697 kernel: Booting paravirtualized kernel on VMware hypervisor
Feb 13 19:19:43.738712 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 19:19:43.738718 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Feb 13 19:19:43.738724 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Feb 13 19:19:43.738729 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Feb 13 19:19:43.738734 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Feb 13 19:19:43.738739 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Feb 13 19:19:43.738746 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Feb 13 19:19:43.738751 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Feb 13 19:19:43.738756 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Feb 13 19:19:43.738768 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Feb 13 19:19:43.738774 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Feb 13 19:19:43.738779 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Feb 13 19:19:43.738784 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Feb 13 19:19:43.738790 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Feb 13 19:19:43.738796 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Feb 13 19:19:43.738802 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Feb 13 19:19:43.738807 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Feb 13 19:19:43.738812 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Feb 13 19:19:43.738817 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Feb 13 19:19:43.738823 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Feb 13 19:19:43.738829 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416
Feb 13 19:19:43.738835 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 19:19:43.738841 kernel: random: crng init done
Feb 13 19:19:43.738846 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Feb 13 19:19:43.738852 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Feb 13 19:19:43.738857 kernel: printk: log_buf_len min size: 262144 bytes
Feb 13 19:19:43.738863 kernel: printk: log_buf_len: 1048576 bytes
Feb 13 19:19:43.738868 kernel: printk: early log buf free: 239648(91%)
Feb 13 19:19:43.738874 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 19:19:43.738879 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 19:19:43.738884 kernel: Fallback order for Node 0: 0
Feb 13 19:19:43.738891 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Feb 13 19:19:43.738896 kernel: Policy zone: DMA32
Feb 13 19:19:43.738902 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 19:19:43.738909 kernel: Memory: 1934304K/2096628K available (14336K kernel code, 2301K rwdata, 22852K rodata, 43476K init, 1596K bss, 162064K reserved, 0K cma-reserved)
Feb 13 19:19:43.738914 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Feb 13 19:19:43.738920 kernel: ftrace: allocating 37893 entries in 149 pages
Feb 13 19:19:43.738927 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 19:19:43.738933 kernel: Dynamic Preempt: voluntary
Feb 13 19:19:43.738938 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 19:19:43.738944 kernel: rcu: RCU event tracing is enabled.
Feb 13 19:19:43.738950 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Feb 13 19:19:43.738955 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 19:19:43.738960 kernel: Rude variant of Tasks RCU enabled.
Feb 13 19:19:43.738966 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 19:19:43.738971 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 19:19:43.738978 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Feb 13 19:19:43.738983 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Feb 13 19:19:43.738989 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Feb 13 19:19:43.738994 kernel: Console: colour VGA+ 80x25
Feb 13 19:19:43.739000 kernel: printk: console [tty0] enabled
Feb 13 19:19:43.739005 kernel: printk: console [ttyS0] enabled
Feb 13 19:19:43.739011 kernel: ACPI: Core revision 20230628
Feb 13 19:19:43.739016 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Feb 13 19:19:43.739022 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 19:19:43.739027 kernel: x2apic enabled
Feb 13 19:19:43.739034 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 19:19:43.739039 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 19:19:43.739045 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Feb 13 19:19:43.739050 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Feb 13 19:19:43.739056 kernel: Disabled fast string operations
Feb 13 19:19:43.739064 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 19:19:43.739073 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 19:19:43.739079 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 19:19:43.739084 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Feb 13 19:19:43.739091 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Feb 13 19:19:43.739097 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Feb 13 19:19:43.739103 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 19:19:43.739108 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 19:19:43.739113 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 19:19:43.739119 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 19:19:43.739124 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 19:19:43.739130 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 19:19:43.739136 kernel: SRBDS: Unknown: Dependent on hypervisor status
Feb 13 19:19:43.739142 kernel: GDS: Unknown: Dependent on hypervisor status
Feb 13 19:19:43.739147 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 19:19:43.739153 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 19:19:43.739158 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 19:19:43.739164 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 19:19:43.739169 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 13 19:19:43.739175 kernel: Freeing SMP alternatives memory: 32K
Feb 13 19:19:43.739180 kernel: pid_max: default: 131072 minimum: 1024
Feb 13 19:19:43.739187 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Feb 13 19:19:43.739192 kernel: landlock: Up and running.
Feb 13 19:19:43.739197 kernel: SELinux: Initializing.
Feb 13 19:19:43.739203 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 19:19:43.739208 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 19:19:43.739214 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 19:19:43.739220 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 19:19:43.739225 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 19:19:43.739230 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 19:19:43.739237 kernel: Performance Events: Skylake events, core PMU driver.
Feb 13 19:19:43.739243 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Feb 13 19:19:43.739248 kernel: core: CPUID marked event: 'instructions' unavailable
Feb 13 19:19:43.739254 kernel: core: CPUID marked event: 'bus cycles' unavailable
Feb 13 19:19:43.739259 kernel: core: CPUID marked event: 'cache references' unavailable
Feb 13 19:19:43.739264 kernel: core: CPUID marked event: 'cache misses' unavailable
Feb 13 19:19:43.739269 kernel: core: CPUID marked event: 'branch instructions' unavailable
Feb 13 19:19:43.739275 kernel: core: CPUID marked event: 'branch misses' unavailable
Feb 13 19:19:43.739281 kernel: ... version: 1
Feb 13 19:19:43.739286 kernel: ... bit width: 48
Feb 13 19:19:43.739292 kernel: ... generic registers: 4
Feb 13 19:19:43.739297 kernel: ... value mask: 0000ffffffffffff
Feb 13 19:19:43.739303 kernel: ... max period: 000000007fffffff
Feb 13 19:19:43.739309 kernel: ... fixed-purpose events: 0
Feb 13 19:19:43.739314 kernel: ... event mask: 000000000000000f
Feb 13 19:19:43.739320 kernel: signal: max sigframe size: 1776
Feb 13 19:19:43.739325 kernel: rcu: Hierarchical SRCU implementation.
Feb 13 19:19:43.739331 kernel: rcu: Max phase no-delay instances is 400.
Feb 13 19:19:43.739337 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
Feb 13 19:19:43.739343 kernel: smp: Bringing up secondary CPUs ...
Feb 13 19:19:43.739348 kernel: smpboot: x86: Booting SMP configuration:
Feb 13 19:19:43.739354 kernel: .... node #0, CPUs: #1
Feb 13 19:19:43.739359 kernel: Disabled fast string operations
Feb 13 19:19:43.739364 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1
Feb 13 19:19:43.739370 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1
Feb 13 19:19:43.739375 kernel: smp: Brought up 1 node, 2 CPUs
Feb 13 19:19:43.739381 kernel: smpboot: Max logical packages: 128
Feb 13 19:19:43.739386 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS)
Feb 13 19:19:43.739393 kernel: devtmpfs: initialized
Feb 13 19:19:43.739398 kernel: x86/mm: Memory block size: 128MB
Feb 13 19:19:43.739404 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes)
Feb 13 19:19:43.739432 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 13 19:19:43.739438 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear)
Feb 13 19:19:43.739444 kernel: pinctrl core: initialized pinctrl subsystem
Feb 13 19:19:43.739463 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 13 19:19:43.739468 kernel: audit: initializing netlink subsys (disabled)
Feb 13 19:19:43.739474 kernel: audit: type=2000 audit(1739474382.065:1): state=initialized audit_enabled=0 res=1
Feb 13 19:19:43.739481 kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 13 19:19:43.739486 kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 13 19:19:43.739492 kernel: cpuidle: using governor menu
Feb 13 19:19:43.739497 kernel: Simple Boot Flag at 0x36 set to 0x80
Feb 13 19:19:43.739502 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 13 19:19:43.739508 kernel: dca service started, version 1.12.1
Feb 13 19:19:43.739513 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000)
Feb 13 19:19:43.739519 kernel: PCI: Using configuration type 1 for base access
Feb 13 19:19:43.739525 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 13 19:19:43.739531 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Feb 13 19:19:43.739536 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Feb 13 19:19:43.739542 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Feb 13 19:19:43.739547 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Feb 13 19:19:43.739553 kernel: ACPI: Added _OSI(Module Device)
Feb 13 19:19:43.739558 kernel: ACPI: Added _OSI(Processor Device)
Feb 13 19:19:43.739563 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 13 19:19:43.739569 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 13 19:19:43.739576 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 13 19:19:43.739581 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored
Feb 13 19:19:43.739586 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Feb 13 19:19:43.739592 kernel: ACPI: Interpreter enabled
Feb 13 19:19:43.739597 kernel: ACPI: PM: (supports S0 S1 S5)
Feb 13 19:19:43.739603 kernel: ACPI: Using IOAPIC for interrupt routing
Feb 13 19:19:43.739608 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 13 19:19:43.739613 kernel: PCI: Using E820 reservations for host bridge windows
Feb 13 19:19:43.739619 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F
Feb 13 19:19:43.739625 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f])
Feb 13 19:19:43.739725 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Feb 13 19:19:43.739780 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR]
Feb 13 19:19:43.739828 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability]
Feb 13 19:19:43.739836 kernel: PCI host bridge to bus 0000:00
Feb 13 19:19:43.739884 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 13 19:19:43.739930 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window]
Feb 13 19:19:43.739976 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 13 19:19:43.740018 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Feb 13 19:19:43.740062 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window]
Feb 13 19:19:43.740103 kernel: pci_bus 0000:00: root bus resource [bus 00-7f]
Feb 13 19:19:43.740159 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000
Feb 13 19:19:43.740218 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400
Feb 13 19:19:43.740273 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100
Feb 13 19:19:43.740327 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a
Feb 13 19:19:43.740375 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f]
Feb 13 19:19:43.740423 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
Feb 13 19:19:43.740471 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
Feb 13 19:19:43.740519 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
Feb 13 19:19:43.740567 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
Feb 13 19:19:43.740621 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000
Feb 13 19:19:43.740670 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI
Feb 13 19:19:43.740748 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Feb 13 19:19:43.740803 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000
Feb 13 19:19:43.742270 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf]
Feb 13 19:19:43.742327 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit]
Feb 13 19:19:43.742390 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000
Feb 13 19:19:43.742450 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f]
Feb 13 19:19:43.742501 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref]
Feb 13 19:19:43.742550 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff]
Feb 13 19:19:43.742597 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref]
Feb 13 19:19:43.742646 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 13 19:19:43.742698 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401
Feb 13 19:19:43.742771 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.742821 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.742876 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.742926 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.742978 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743028 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743084 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743134 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743186 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743235 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743287 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743337 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743392 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743447 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743499 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743550 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743602 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743652 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743721 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743773 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743827 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743876 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.743929 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.743978 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744032 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744081 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744134 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744183 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744235 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744283 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744338 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744387 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744439 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744488 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744541 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744590 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.744646 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.744695 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.746868 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.746939 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.746998 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747049 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747102 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747154 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747207 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747257 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747310 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747358 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747412 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747466 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747519 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747568 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747620 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747670 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747733 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747786 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747839 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747889 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.747942 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.747991 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.748044 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.748097 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.748150 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400
Feb 13 19:19:43.748200 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.748251 kernel: pci_bus 0000:01: extended config space not accessible
Feb 13 19:19:43.748302 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 19:19:43.748352 kernel: pci_bus 0000:02: extended config space not accessible
Feb 13 19:19:43.748361 kernel: acpiphp: Slot [32] registered
Feb 13 19:19:43.748369 kernel: acpiphp: Slot [33] registered
Feb 13 19:19:43.748374 kernel: acpiphp: Slot [34] registered
Feb 13 19:19:43.748380 kernel: acpiphp: Slot [35] registered
Feb 13 19:19:43.748385 kernel: acpiphp: Slot [36] registered
Feb 13 19:19:43.748390 kernel: acpiphp: Slot [37] registered
Feb 13 19:19:43.748396 kernel: acpiphp: Slot [38] registered
Feb 13 19:19:43.748401 kernel: acpiphp: Slot [39] registered
Feb 13 19:19:43.748414 kernel: acpiphp: Slot [40] registered
Feb 13 19:19:43.748420 kernel: acpiphp: Slot [41] registered
Feb 13 19:19:43.748427 kernel: acpiphp: Slot [42] registered
Feb 13 19:19:43.748433 kernel: acpiphp: Slot [43] registered
Feb 13 19:19:43.748439 kernel: acpiphp: Slot [44] registered
Feb 13 19:19:43.748444 kernel: acpiphp: Slot [45] registered
Feb 13 19:19:43.748449 kernel: acpiphp: Slot [46] registered
Feb 13 19:19:43.748455 kernel: acpiphp: Slot [47] registered
Feb 13 19:19:43.748460 kernel: acpiphp: Slot [48] registered
Feb 13 19:19:43.748466 kernel: acpiphp: Slot [49] registered
Feb 13 19:19:43.748471 kernel: acpiphp: Slot [50] registered
Feb 13 19:19:43.748478 kernel: acpiphp: Slot [51] registered
Feb 13 19:19:43.748483 kernel: acpiphp: Slot [52] registered
Feb 13 19:19:43.748488 kernel: acpiphp: Slot [53] registered
Feb 13 19:19:43.748494 kernel: acpiphp: Slot [54] registered
Feb 13 19:19:43.748499 kernel: acpiphp: Slot [55] registered
Feb 13 19:19:43.748504 kernel: acpiphp: Slot [56] registered
Feb 13 19:19:43.748510 kernel: acpiphp: Slot [57] registered
Feb 13 19:19:43.748515 kernel: acpiphp: Slot [58] registered
Feb 13 19:19:43.748520 kernel: acpiphp: Slot [59] registered
Feb 13 19:19:43.748526 kernel: acpiphp: Slot [60] registered
Feb 13 19:19:43.748532 kernel: acpiphp: Slot [61] registered
Feb 13 19:19:43.748538 kernel: acpiphp: Slot [62] registered
Feb 13 19:19:43.748543 kernel: acpiphp: Slot [63] registered
Feb 13 19:19:43.748594 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Feb 13 19:19:43.748643 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Feb 13 19:19:43.748691 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Feb 13 19:19:43.749154 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Feb 13 19:19:43.749207 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Feb 13 19:19:43.749259 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Feb 13 19:19:43.749308 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Feb 13 19:19:43.749357 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Feb 13 19:19:43.749410 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Feb 13 19:19:43.749468 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700
Feb 13 19:19:43.749518 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007]
Feb 13 19:19:43.749569 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit]
Feb 13 19:19:43.749621 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Feb 13 19:19:43.749670 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Feb 13 19:19:43.749731 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Feb 13 19:19:43.749782 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Feb 13 19:19:43.749832 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Feb 13 19:19:43.749881 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Feb 13 19:19:43.749931 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Feb 13 19:19:43.749979 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Feb 13 19:19:43.750031 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Feb 13 19:19:43.750080 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Feb 13 19:19:43.750130 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Feb 13 19:19:43.751806 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Feb 13 19:19:43.751858 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Feb 13 19:19:43.751907 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Feb 13 19:19:43.751957 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Feb 13 19:19:43.752009 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Feb 13 19:19:43.752057 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Feb 13 19:19:43.752106 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Feb 13 19:19:43.752155 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Feb 13 19:19:43.752203 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Feb 13 19:19:43.752255 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Feb 13 19:19:43.752304 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Feb 13 19:19:43.752353 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Feb 13 19:19:43.752403 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Feb 13 19:19:43.752456 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Feb 13 19:19:43.752505 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Feb 13 19:19:43.752555 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Feb 13 19:19:43.752605 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Feb 13 19:19:43.752657 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Feb 13 19:19:43.756737 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000
Feb 13 19:19:43.756802 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff]
Feb 13 19:19:43.756857 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff]
Feb 13 19:19:43.756909 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff]
Feb 13 19:19:43.756961 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f]
Feb 13 19:19:43.757012 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Feb 13 19:19:43.757067 kernel: pci 0000:0b:00.0: supports D1 D2
Feb 13 19:19:43.757118 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Feb 13 19:19:43.757168 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Feb 13 19:19:43.757220 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Feb 13 19:19:43.757270 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Feb 13 19:19:43.757319 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Feb 13 19:19:43.757369 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Feb 13 19:19:43.757419 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Feb 13 19:19:43.757470 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Feb 13 19:19:43.757519 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Feb 13 19:19:43.757570 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Feb 13 19:19:43.757619 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Feb 13 19:19:43.757668 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Feb 13 19:19:43.757773 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Feb 13 19:19:43.757826 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Feb 13 19:19:43.757878 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Feb 13 19:19:43.757928 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Feb 13 19:19:43.757978 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Feb 13 19:19:43.758026 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Feb 13 19:19:43.758074 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Feb 13 19:19:43.758123 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Feb 13 19:19:43.758172 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Feb 13 19:19:43.758220 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Feb 13 19:19:43.758271 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Feb 13 19:19:43.758320 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Feb 13 19:19:43.758369 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Feb 13 19:19:43.758421 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Feb 13 19:19:43.758470 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Feb 13 19:19:43.758518 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Feb 13 19:19:43.758568 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Feb 13 19:19:43.758617 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Feb 13 19:19:43.758668 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Feb 13 19:19:43.759924 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Feb 13 19:19:43.759980 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Feb 13 19:19:43.760030 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Feb 13 19:19:43.760079 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Feb 13 19:19:43.760128 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Feb 13 19:19:43.760178 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Feb 13 19:19:43.760227 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Feb 13 19:19:43.760278 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Feb 13 19:19:43.760327 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Feb 13 19:19:43.760377 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Feb 13 19:19:43.760441 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Feb 13 19:19:43.760502 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Feb 13 19:19:43.760551 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Feb 13 19:19:43.760600 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Feb 13 19:19:43.760662 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Feb 13 19:19:43.760743 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Feb 13 19:19:43.760794 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Feb 13 19:19:43.760842 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Feb 13 19:19:43.760891 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Feb 13 19:19:43.760940 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Feb 13 19:19:43.760989 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Feb 13 19:19:43.761037 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Feb 13 19:19:43.761086 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Feb 13 19:19:43.761138 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Feb 13 19:19:43.761186 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Feb 13 19:19:43.761235 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Feb 13 19:19:43.761282 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Feb 13 19:19:43.761331 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Feb 13 19:19:43.761379 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Feb 13 19:19:43.761447 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Feb 13 19:19:43.761546 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Feb 13 19:19:43.761594 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Feb 13 19:19:43.761643 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Feb 13 19:19:43.761691 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Feb 13 19:19:43.761746 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Feb 13 19:19:43.761796 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Feb 13 19:19:43.761844 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Feb 13 19:19:43.761892 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Feb 13 19:19:43.761944 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Feb 13 19:19:43.761992 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Feb 13 19:19:43.762041 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Feb 13 19:19:43.762090 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Feb 13 19:19:43.762138 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Feb 13 19:19:43.762186 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Feb 13 19:19:43.762235 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Feb 13 19:19:43.762285 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Feb 13 19:19:43.762335 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Feb 13 19:19:43.762384 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Feb 13 19:19:43.762436 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Feb 13 19:19:43.762485 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Feb 13 19:19:43.762494 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Feb 13 19:19:43.762500 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Feb 13 19:19:43.762505 kernel: ACPI: PCI: Interrupt link LNKB disabled
Feb 13 19:19:43.762511 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 13 19:19:43.762516 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Feb 13 19:19:43.762524 kernel: iommu: Default domain type: Translated
Feb 13 19:19:43.762530 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 19:19:43.762535 kernel: PCI: Using ACPI for IRQ routing
Feb 13 19:19:43.762540 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 19:19:43.762546 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Feb 13 19:19:43.762552 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Feb 13 19:19:43.762600 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Feb 13 19:19:43.762677 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Feb 13 19:19:43.762733 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 19:19:43.762743 kernel: vgaarb: loaded
Feb 13 19:19:43.762749 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Feb 13 19:19:43.762755 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Feb 13 19:19:43.762760 kernel: clocksource: Switched to clocksource tsc-early
Feb 13 19:19:43.762766 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 19:19:43.762771 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 19:19:43.762777 kernel: pnp: PnP ACPI init
Feb 13 19:19:43.762827 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Feb 13 19:19:43.762876 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Feb 13 19:19:43.762920 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Feb 13 19:19:43.762968 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Feb 13 19:19:43.763016 kernel: pnp 00:06: [dma 2]
Feb 13 19:19:43.763066 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Feb 13 19:19:43.763111 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Feb 13 19:19:43.763157 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Feb 13 19:19:43.763165 kernel: pnp: PnP ACPI: found 8 devices
Feb 13 19:19:43.763171 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 19:19:43.763181 kernel: NET: Registered PF_INET protocol family
Feb 13 19:19:43.763187 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 13 19:19:43.763201 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Feb 13 19:19:43.763225 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 19:19:43.763250 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 13 19:19:43.763274 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 19:19:43.763306 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Feb 13 19:19:43.763330 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 19:19:43.763355 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 19:19:43.763379 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 19:19:43.763401 kernel: NET: Registered PF_XDP protocol family
Feb 13 19:19:43.763582 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Feb 13 19:19:43.763637 kernel: pci
0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 19:19:43.763700 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 19:19:43.763806 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 19:19:43.763856 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 19:19:43.763906 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Feb 13 19:19:43.763956 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Feb 13 19:19:43.764004 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Feb 13 19:19:43.764056 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Feb 13 19:19:43.764105 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Feb 13 19:19:43.764154 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Feb 13 19:19:43.764203 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Feb 13 19:19:43.764251 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Feb 13 19:19:43.764299 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Feb 13 19:19:43.764350 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Feb 13 19:19:43.764399 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Feb 13 19:19:43.764447 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Feb 13 19:19:43.764495 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Feb 13 19:19:43.764544 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Feb 13 19:19:43.764592 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Feb 13 19:19:43.764643 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Feb 13 19:19:43.764691 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Feb 13 19:19:43.764767 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Feb 13 19:19:43.766310 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:19:43.766365 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:19:43.766431 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.766494 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.766545 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.766593 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.766642 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.766690 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.766752 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.766801 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.766850 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.766899 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.766951 kernel: pci 0000:00:16.3: BAR 13: no space for [io 
size 0x1000] Feb 13 19:19:43.767000 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767048 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767096 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767145 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767193 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767242 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767291 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767342 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767390 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767474 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767554 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767602 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767651 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767699 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767783 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767836 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767896 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.767945 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.767994 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768042 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768090 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768138 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768187 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768238 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768286 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768334 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768384 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768432 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768480 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768529 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768578 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768630 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768697 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768773 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768823 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768872 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.768921 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.768970 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769019 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769068 kernel: pci 0000:00:18.3: BAR 13: no space 
for [io size 0x1000] Feb 13 19:19:43.769119 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769168 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769217 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769267 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769315 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769364 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769434 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769497 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769546 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769594 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769646 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769695 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769751 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769800 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769849 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769897 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.769946 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.769993 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770042 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770093 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770141 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770190 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770237 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770286 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770334 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770383 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770431 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770479 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770527 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770578 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770627 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770676 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 19:19:43.770731 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:19:43.770780 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 19:19:43.770830 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Feb 13 19:19:43.770878 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 19:19:43.770926 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 19:19:43.770975 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:19:43.771028 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Feb 13 19:19:43.771078 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 
13 19:19:43.771127 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 19:19:43.771175 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 19:19:43.771224 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:19:43.771273 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 19:19:43.771322 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 19:19:43.771370 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 19:19:43.771443 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 19:19:43.771545 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 19:19:43.771594 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 19:19:43.771642 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 19:19:43.771692 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:19:43.771747 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 19:19:43.771796 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 19:19:43.771845 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:19:43.771893 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 19:19:43.771941 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 19:19:43.771992 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:19:43.772042 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 19:19:43.772091 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 19:19:43.772139 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:19:43.772187 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 19:19:43.772235 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 19:19:43.772286 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:19:43.772334 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 19:19:43.772383 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 19:19:43.772437 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:19:43.772489 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Feb 13 19:19:43.772538 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 19:19:43.772586 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 19:19:43.772635 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 19:19:43.772684 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:19:43.772751 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 19:19:43.772801 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 19:19:43.772850 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 19:19:43.772899 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:19:43.772948 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 19:19:43.772997 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 19:19:43.773046 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 19:19:43.773095 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:19:43.773143 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 19:19:43.773193 kernel: pci 0000:00:16.3: 
bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 19:19:43.773242 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:19:43.773291 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 19:19:43.773339 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 19:19:43.773388 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:19:43.773436 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 19:19:43.773485 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 19:19:43.773534 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:19:43.773583 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 19:19:43.773631 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 19:19:43.773683 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:19:43.773768 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 19:19:43.773818 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 19:19:43.773867 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:19:43.773915 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 19:19:43.773963 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 19:19:43.774012 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 19:19:43.774059 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:19:43.774107 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 19:19:43.774158 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 19:19:43.774206 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 19:19:43.774255 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:19:43.774304 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 19:19:43.774361 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 19:19:43.774409 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 19:19:43.774457 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:19:43.774505 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 19:19:43.774554 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 19:19:43.774603 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:19:43.774654 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 19:19:43.774807 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 19:19:43.774859 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:19:43.774909 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 19:19:43.774957 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 19:19:43.775006 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 19:19:43.775055 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 19:19:43.775103 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 19:19:43.775152 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:19:43.775203 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 19:19:43.775250 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 19:19:43.775299 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:19:43.775348 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 19:19:43.775396 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 19:19:43.775456 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 19:19:43.775507 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:19:43.775556 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 19:19:43.775605 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 19:19:43.775654 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 19:19:43.775712 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:19:43.775762 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 19:19:43.775811 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 19:19:43.775859 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:19:43.775908 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 19:19:43.775957 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 19:19:43.776006 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:19:43.776054 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 19:19:43.776103 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 19:19:43.776154 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 19:19:43.776203 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 19:19:43.776251 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 19:19:43.776299 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 19:19:43.776347 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 19:19:43.776396 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 19:19:43.776444 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 19:19:43.776493 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 19:19:43.776541 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 19:19:43.776589 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 19:19:43.776638 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 19:19:43.776682 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 19:19:43.776751 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 19:19:43.776795 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Feb 13 19:19:43.776837 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Feb 13 19:19:43.776883 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Feb 13 19:19:43.776928 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Feb 13 19:19:43.776975 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:19:43.777019 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 19:19:43.777064 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 19:19:43.777109 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 19:19:43.777153 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Feb 13 19:19:43.777197 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Feb 13 19:19:43.777247 kernel: pci_bus 0000:03: resource 0 [io 
0x4000-0x4fff] Feb 13 19:19:43.777295 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Feb 13 19:19:43.777339 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:19:43.777390 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Feb 13 19:19:43.777443 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Feb 13 19:19:43.777494 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 19:19:43.777542 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Feb 13 19:19:43.777587 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Feb 13 19:19:43.777635 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:19:43.777682 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Feb 13 19:19:43.777740 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:19:43.777791 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Feb 13 19:19:43.777837 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:19:43.777885 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Feb 13 19:19:43.777930 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:19:43.777981 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Feb 13 19:19:43.778027 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:19:43.778078 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Feb 13 19:19:43.778132 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:19:43.778183 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Feb 13 19:19:43.778232 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Feb 13 19:19:43.778277 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:19:43.778328 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Feb 13 19:19:43.778373 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Feb 13 19:19:43.778427 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:19:43.778477 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Feb 13 19:19:43.778524 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Feb 13 19:19:43.778572 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:19:43.778621 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Feb 13 19:19:43.778667 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:19:43.779738 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Feb 13 19:19:43.779794 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:19:43.779847 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Feb 13 19:19:43.779896 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:19:43.779945 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Feb 13 19:19:43.779990 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:19:43.780039 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Feb 13 19:19:43.780084 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:19:43.780132 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Feb 13 19:19:43.780180 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Feb 13 19:19:43.780224 
kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:19:43.780272 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Feb 13 19:19:43.780317 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Feb 13 19:19:43.780362 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:19:43.780412 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Feb 13 19:19:43.780457 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Feb 13 19:19:43.780503 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:19:43.780552 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Feb 13 19:19:43.780597 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:19:43.780644 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Feb 13 19:19:43.780689 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:19:43.780751 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Feb 13 19:19:43.780796 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 19:19:43.780848 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Feb 13 19:19:43.780893 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:19:43.780942 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Feb 13 19:19:43.780988 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:19:43.781040 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Feb 13 19:19:43.781088 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Feb 13 19:19:43.781133 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:19:43.781181 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Feb 13 19:19:43.781227 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Feb 13 19:19:43.781272 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:19:43.781323 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Feb 13 19:19:43.781368 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:19:43.781442 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Feb 13 19:19:43.781534 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:19:43.781583 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Feb 13 19:19:43.781628 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 19:19:43.781676 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Feb 13 19:19:43.782042 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 19:19:43.782100 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Feb 13 19:19:43.782147 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 19:19:43.782196 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Feb 13 19:19:43.782242 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 19:19:43.782295 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 19:19:43.782304 kernel: PCI: CLS 32 bytes, default 64 Feb 13 19:19:43.782311 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 19:19:43.782319 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 
19:19:43.782325 kernel: clocksource: Switched to clocksource tsc Feb 13 19:19:43.782331 kernel: Initialise system trusted keyrings Feb 13 19:19:43.782338 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 19:19:43.782344 kernel: Key type asymmetric registered Feb 13 19:19:43.782349 kernel: Asymmetric key parser 'x509' registered Feb 13 19:19:43.782356 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 19:19:43.782361 kernel: io scheduler mq-deadline registered Feb 13 19:19:43.782367 kernel: io scheduler kyber registered Feb 13 19:19:43.782374 kernel: io scheduler bfq registered Feb 13 19:19:43.782808 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Feb 13 19:19:43.782869 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783204 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Feb 13 19:19:43.783268 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783322 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Feb 13 19:19:43.783374 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783425 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Feb 13 19:19:43.783478 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783527 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Feb 13 19:19:43.783576 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783626 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Feb 13 19:19:43.783675 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783734 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Feb 13 19:19:43.783785 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783834 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Feb 13 19:19:43.783883 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.783932 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Feb 13 19:19:43.784001 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784064 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Feb 13 19:19:43.784113 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784163 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Feb 13 19:19:43.784212 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784261 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Feb 13 19:19:43.784311 kernel: pcieport 
0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784362 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Feb 13 19:19:43.784411 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784460 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Feb 13 19:19:43.784510 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784560 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Feb 13 19:19:43.784612 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.784661 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Feb 13 19:19:43.785028 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785090 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Feb 13 19:19:43.785143 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785195 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Feb 13 19:19:43.785245 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785304 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Feb 13 19:19:43.785369 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785429 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Feb 13 19:19:43.785481 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785532 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Feb 13 19:19:43.785581 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785635 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Feb 13 19:19:43.785686 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785744 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Feb 13 19:19:43.785795 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785845 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Feb 13 19:19:43.785898 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.785948 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Feb 13 19:19:43.785998 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786048 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Feb 13 19:19:43.786098 kernel: pcieport 
0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786149 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Feb 13 19:19:43.786201 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786259 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Feb 13 19:19:43.786309 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786358 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Feb 13 19:19:43.786407 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786457 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Feb 13 19:19:43.786523 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.786590 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Feb 13 19:19:43.786654 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.789519 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Feb 13 19:19:43.789590 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:19:43.789603 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 19:19:43.789609 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 19:19:43.789616 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 19:19:43.789622 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 19:19:43.789628 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 19:19:43.789633 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 19:19:43.789685 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 19:19:43.789742 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T19:19:43 UTC (1739474383) Feb 13 19:19:43.789791 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 19:19:43.789799 kernel: intel_pstate: CPU model not supported Feb 13 19:19:43.789805 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 19:19:43.789811 kernel: NET: Registered PF_INET6 protocol family Feb 13 19:19:43.789817 kernel: Segment Routing with IPv6 Feb 13 19:19:43.789823 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 19:19:43.789829 kernel: NET: Registered PF_PACKET protocol family Feb 13 19:19:43.789835 kernel: Key type dns_resolver registered Feb 13 19:19:43.789843 kernel: IPI shorthand broadcast: enabled Feb 13 19:19:43.789849 kernel: sched_clock: Marking stable (843399034, 215013085)->(1107291315, -48879196) Feb 13 19:19:43.789854 kernel: registered taskstats version 1 Feb 13 19:19:43.789860 kernel: Loading compiled-in X.509 certificates Feb 13 19:19:43.789866 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: 6c364ddae48101e091a28279a8d953535f596d53' Feb 13 19:19:43.789872 kernel: Key type .fscrypt registered Feb 13 19:19:43.789878 kernel: Key type fscrypt-provisioning registered Feb 13 19:19:43.789884 
kernel: ima: No TPM chip found, activating TPM-bypass! Feb 13 19:19:43.789890 kernel: ima: Allocated hash algorithm: sha1 Feb 13 19:19:43.789897 kernel: ima: No architecture policies found Feb 13 19:19:43.789903 kernel: clk: Disabling unused clocks Feb 13 19:19:43.789909 kernel: Freeing unused kernel image (initmem) memory: 43476K Feb 13 19:19:43.789915 kernel: Write protecting the kernel read-only data: 38912k Feb 13 19:19:43.789920 kernel: Freeing unused kernel image (rodata/data gap) memory: 1724K Feb 13 19:19:43.789926 kernel: Run /init as init process Feb 13 19:19:43.789932 kernel: with arguments: Feb 13 19:19:43.789938 kernel: /init Feb 13 19:19:43.789944 kernel: with environment: Feb 13 19:19:43.789950 kernel: HOME=/ Feb 13 19:19:43.789956 kernel: TERM=linux Feb 13 19:19:43.789961 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 19:19:43.789968 systemd[1]: Successfully made /usr/ read-only. Feb 13 19:19:43.789977 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 19:19:43.789983 systemd[1]: Detected virtualization vmware. Feb 13 19:19:43.789989 systemd[1]: Detected architecture x86-64. Feb 13 19:19:43.789995 systemd[1]: Running in initrd. Feb 13 19:19:43.790003 systemd[1]: No hostname configured, using default hostname. Feb 13 19:19:43.790009 systemd[1]: Hostname set to . Feb 13 19:19:43.790015 systemd[1]: Initializing machine ID from random generator. Feb 13 19:19:43.790021 systemd[1]: Queued start job for default target initrd.target. Feb 13 19:19:43.790027 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:19:43.790033 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:19:43.790040 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 19:19:43.790047 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 19:19:43.790054 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 19:19:43.790061 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 19:19:43.790068 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 19:19:43.790074 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 19:19:43.790081 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:19:43.790087 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:19:43.790093 systemd[1]: Reached target paths.target - Path Units. Feb 13 19:19:43.790101 systemd[1]: Reached target slices.target - Slice Units. Feb 13 19:19:43.790107 systemd[1]: Reached target swap.target - Swaps. Feb 13 19:19:43.790113 systemd[1]: Reached target timers.target - Timer Units. Feb 13 19:19:43.790119 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:19:43.790125 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Feb 13 19:19:43.790131 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Feb 13 19:19:43.790137 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Feb 13 19:19:43.790143 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 19:19:43.790361 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 19:19:43.790368 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 19:19:43.790375 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 19:19:43.790381 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Feb 13 19:19:43.790387 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 19:19:43.790393 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Feb 13 19:19:43.790399 systemd[1]: Starting systemd-fsck-usr.service...
Feb 13 19:19:43.790406 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 19:19:43.790412 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 19:19:43.790420 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:19:43.790439 systemd-journald[217]: Collecting audit messages is disabled.
Feb 13 19:19:43.790455 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Feb 13 19:19:43.790462 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 19:19:43.790470 systemd[1]: Finished systemd-fsck-usr.service.
Feb 13 19:19:43.790476 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 19:19:43.790483 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 13 19:19:43.790489 kernel: Bridge firewalling registered
Feb 13 19:19:43.790496 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 19:19:43.790503 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 19:19:43.790509 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:19:43.790516 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:19:43.790522 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 19:19:43.790528 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 19:19:43.790534 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 19:19:43.790541 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 19:19:43.790548 systemd-journald[217]: Journal started
Feb 13 19:19:43.790564 systemd-journald[217]: Runtime Journal (/run/log/journal/e4bfc89522eb492292a519409f544e61) is 4.8M, max 38.6M, 33.8M free.
Feb 13 19:19:43.748458 systemd-modules-load[218]: Inserted module 'overlay'
Feb 13 19:19:43.766317 systemd-modules-load[218]: Inserted module 'br_netfilter'
Feb 13 19:19:43.792780 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 19:19:43.794140 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:19:43.798788 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Feb 13 19:19:43.800366 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 19:19:43.805630 dracut-cmdline[247]: dracut-dracut-053
Feb 13 19:19:43.804868 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 19:19:43.805981 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 19:19:43.808350 dracut-cmdline[247]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f28373bbaddf11103b551b595069cf5faacb27d62f1aab4f9911393ba418b416
Feb 13 19:19:43.828406 systemd-resolved[261]: Positive Trust Anchors:
Feb 13 19:19:43.828412 systemd-resolved[261]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 19:19:43.828432 systemd-resolved[261]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 19:19:43.830760 systemd-resolved[261]: Defaulting to hostname 'linux'.
Feb 13 19:19:43.831404 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 19:19:43.831736 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 19:19:43.853716 kernel: SCSI subsystem initialized
Feb 13 19:19:43.859713 kernel: Loading iSCSI transport class v2.0-870.
Feb 13 19:19:43.865713 kernel: iscsi: registered transport (tcp)
Feb 13 19:19:43.877917 kernel: iscsi: registered transport (qla4xxx)
Feb 13 19:19:43.877937 kernel: QLogic iSCSI HBA Driver
Feb 13 19:19:43.896587 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Feb 13 19:19:43.900842 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Feb 13 19:19:43.915359 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 13 19:19:43.915382 kernel: device-mapper: uevent: version 1.0.3
Feb 13 19:19:43.915390 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Feb 13 19:19:43.945744 kernel: raid6: avx2x4 gen() 49661 MB/s
Feb 13 19:19:43.962746 kernel: raid6: avx2x2 gen() 55557 MB/s
Feb 13 19:19:43.979894 kernel: raid6: avx2x1 gen() 46674 MB/s
Feb 13 19:19:43.979911 kernel: raid6: using algorithm avx2x2 gen() 55557 MB/s
Feb 13 19:19:43.997903 kernel: raid6: .... xor() 33415 MB/s, rmw enabled
Feb 13 19:19:43.997927 kernel: raid6: using avx2x2 recovery algorithm
Feb 13 19:19:44.010717 kernel: xor: automatically using best checksumming function avx
Feb 13 19:19:44.094718 kernel: Btrfs loaded, zoned=no, fsverity=no
Feb 13 19:19:44.100175 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 19:19:44.105793 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 19:19:44.113225 systemd-udevd[435]: Using default interface naming scheme 'v255'.
Feb 13 19:19:44.115981 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 19:19:44.129225 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Feb 13 19:19:44.137265 dracut-pre-trigger[441]: rd.md=0: removing MD RAID activation
Feb 13 19:19:44.152811 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 19:19:44.156949 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 19:19:44.223150 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 19:19:44.225826 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Feb 13 19:19:44.234079 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Feb 13 19:19:44.235671 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 19:19:44.235792 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 19:19:44.235897 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 19:19:44.241791 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Feb 13 19:19:44.247296 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 19:19:44.286715 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Feb 13 19:19:44.288067 kernel: vmw_pvscsi: using 64bit dma
Feb 13 19:19:44.288088 kernel: vmw_pvscsi: max_id: 16
Feb 13 19:19:44.288096 kernel: vmw_pvscsi: setting ring_pages to 8
Feb 13 19:19:44.294568 kernel: vmw_pvscsi: enabling reqCallThreshold
Feb 13 19:19:44.294585 kernel: vmw_pvscsi: driver-based request coalescing enabled
Feb 13 19:19:44.294594 kernel: vmw_pvscsi: using MSI-X
Feb 13 19:19:44.299097 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Feb 13 19:19:44.299118 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Feb 13 19:19:44.299275 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Feb 13 19:19:44.299361 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Feb 13 19:19:44.303712 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Feb 13 19:19:44.307375 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Feb 13 19:19:44.307398 kernel: libata version 3.00 loaded.
Feb 13 19:19:44.310715 kernel: ata_piix 0000:00:07.1: version 2.13
Feb 13 19:19:44.315551 kernel: scsi host1: ata_piix
Feb 13 19:19:44.315625 kernel: scsi host2: ata_piix
Feb 13 19:19:44.315697 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Feb 13 19:19:44.315750 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Feb 13 19:19:44.321712 kernel: cryptd: max_cpu_qlen set to 1000
Feb 13 19:19:44.327279 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 19:19:44.327512 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:19:44.327848 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:19:44.328101 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 19:19:44.328182 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:19:44.329180 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Feb 13 19:19:44.329343 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:19:44.332829 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:19:44.344582 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:19:44.348818 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 19:19:44.361820 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:19:44.485772 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Feb 13 19:19:44.491734 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Feb 13 19:19:44.501882 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 19:19:44.501917 kernel: AES CTR mode by8 optimization enabled Feb 13 19:19:44.515068 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Feb 13 19:19:44.521638 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 19:19:44.521728 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Feb 13 19:19:44.521793 kernel: sd 0:0:0:0: [sda] Cache data unavailable Feb 13 19:19:44.521853 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Feb 13 19:19:44.521913 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 19:19:44.521922 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 19:19:44.521979 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Feb 13 19:19:44.532423 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Feb 13 19:19:44.532438 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Feb 13 19:19:44.560719 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (482) Feb 13 19:19:44.565723 kernel: BTRFS: device fsid 60f89c25-9096-4268-99ca-ef7992742f2b devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (488) Feb 13 19:19:44.570378 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Feb 13 19:19:44.575746 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Feb 13 19:19:44.581077 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 19:19:44.585394 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Feb 13 19:19:44.585651 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Feb 13 19:19:44.589783 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Feb 13 19:19:44.612720 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 19:19:44.616718 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 19:19:45.618755 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 19:19:45.618939 disk-uuid[595]: The operation has completed successfully. Feb 13 19:19:45.651610 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 19:19:45.651678 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 19:19:45.676907 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 19:19:45.678650 sh[611]: Success Feb 13 19:19:45.686715 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 19:19:45.734363 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
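
The rename from eth0 to ens192 above is udev's predictable interface naming at work, under the 'v255' naming scheme systemd-udevd selected earlier: ens192 encodes "Ethernet, hotplug slot 192", derived from the vmxnet3 device's PCI slot rather than from probe order. The builtin that computes the name can be replayed against the live device, a sketch assuming the interface is present:

# Print the candidate names udev's net_id builtin derives for the NIC.
udevadm test-builtin net_id /sys/class/net/ens192 2>/dev/null | grep ID_NET_NAME
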
Feb 13 19:19:45.741161 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Feb 13 19:19:45.741547 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 19:19:45.756307 kernel: BTRFS info (device dm-0): first mount of filesystem 60f89c25-9096-4268-99ca-ef7992742f2b Feb 13 19:19:45.756328 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:19:45.756337 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 19:19:45.758198 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 19:19:45.758212 kernel: BTRFS info (device dm-0): using free space tree Feb 13 19:19:45.765721 kernel: BTRFS info (device dm-0): enabling ssd optimizations Feb 13 19:19:45.767549 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 19:19:45.776822 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Feb 13 19:19:45.778009 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Feb 13 19:19:45.802758 kernel: BTRFS info (device sda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 19:19:45.802787 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:19:45.802801 kernel: BTRFS info (device sda6): using free space tree Feb 13 19:19:45.808813 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 19:19:45.817717 kernel: BTRFS info (device sda6): last unmount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 19:19:45.816746 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 19:19:45.820623 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 19:19:45.823793 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 19:19:45.844284 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 19:19:45.850785 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Feb 13 19:19:45.892355 ignition[674]: Ignition 2.20.0 Feb 13 19:19:45.892618 ignition[674]: Stage: fetch-offline Feb 13 19:19:45.892753 ignition[674]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:45.892846 ignition[674]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:45.892906 ignition[674]: parsed url from cmdline: "" Feb 13 19:19:45.892908 ignition[674]: no config URL provided Feb 13 19:19:45.892911 ignition[674]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 19:19:45.892916 ignition[674]: no config at "/usr/lib/ignition/user.ign" Feb 13 19:19:45.893260 ignition[674]: config successfully fetched Feb 13 19:19:45.893276 ignition[674]: parsing config with SHA512: e059d80eec1f3c67f69a8cac505d725855a30a936866a29b49c51c7fddb624e90f687b455929daa0e243bbeb36c5f5518616a3b2f8f478da51dba7674fb44169 Feb 13 19:19:45.896349 unknown[674]: fetched base config from "system" Feb 13 19:19:45.896545 unknown[674]: fetched user config from "vmware" Feb 13 19:19:45.896848 ignition[674]: fetch-offline: fetch-offline passed Feb 13 19:19:45.896890 ignition[674]: Ignition finished successfully Feb 13 19:19:45.897835 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 19:19:45.910008 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 19:19:45.913799 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
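
The fetch-offline stage above found no config URL on the kernel command line, so Ignition fell back to the VMware platform source, fetching the user config from guestinfo properties and hashing it with the SHA512 shown. From inside the guest the same properties can be read back through open-vm-tools, a sketch that assumes the config was supplied base64-encoded under the standard guestinfo.ignition.* keys:

# Read back the Ignition config the hypervisor provided.
vmware-rpctool 'info-get guestinfo.ignition.config.data.encoding'
vmware-rpctool 'info-get guestinfo.ignition.config.data' | base64 -d | head
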
Feb 13 19:19:45.927043 systemd-networkd[807]: lo: Link UP Feb 13 19:19:45.927049 systemd-networkd[807]: lo: Gained carrier Feb 13 19:19:45.927810 systemd-networkd[807]: Enumeration completed Feb 13 19:19:45.927955 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 19:19:45.928040 systemd-networkd[807]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Feb 13 19:19:45.928095 systemd[1]: Reached target network.target - Network. Feb 13 19:19:45.928187 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 19:19:45.931668 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 19:19:45.931791 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 19:19:45.931491 systemd-networkd[807]: ens192: Link UP Feb 13 19:19:45.931493 systemd-networkd[807]: ens192: Gained carrier Feb 13 19:19:45.937931 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 19:19:45.945799 ignition[810]: Ignition 2.20.0 Feb 13 19:19:45.945807 ignition[810]: Stage: kargs Feb 13 19:19:45.945913 ignition[810]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:45.945919 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:45.946562 ignition[810]: kargs: kargs passed Feb 13 19:19:45.946592 ignition[810]: Ignition finished successfully Feb 13 19:19:45.947749 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Feb 13 19:19:45.956904 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Feb 13 19:19:45.963252 ignition[817]: Ignition 2.20.0 Feb 13 19:19:45.963258 ignition[817]: Stage: disks Feb 13 19:19:45.963352 ignition[817]: no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:45.963358 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:45.963899 ignition[817]: disks: disks passed Feb 13 19:19:45.963925 ignition[817]: Ignition finished successfully Feb 13 19:19:45.964754 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 19:19:45.964909 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 19:19:45.965011 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Feb 13 19:19:45.965115 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 19:19:45.965207 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 19:19:45.965298 systemd[1]: Reached target basic.target - Basic System. Feb 13 19:19:45.968949 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 19:19:45.979301 systemd-fsck[825]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Feb 13 19:19:45.980401 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 19:19:46.763763 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 19:19:46.819717 kernel: EXT4-fs (sda9): mounted filesystem 157595f2-1515-4117-a2d1-73fe2ed647fc r/w with ordered data mode. Quota mode: none. Feb 13 19:19:46.820106 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 19:19:46.820432 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 19:19:46.828769 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 19:19:46.830190 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
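
In the lines above, the initrd's systemd-networkd brings ens192 up from a generated unit, 10-dracut-cmdline-99.network, which dracut derives from network kernel arguments (supplied here by the Afterburn service that ran earlier); after switch-root, the 00-vmware.network file written by Ignition takes over. For reference, a minimal hand-written equivalent that simply runs DHCP on the interface; the file name and contents are illustrative, not the generated unit itself:

cat <<'EOF' > /etc/systemd/network/10-ens192.network
[Match]
Name=ens192

[Network]
DHCP=yes
EOF
networkctl reload   # have networkd re-read its configuration
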
Feb 13 19:19:46.830656 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 19:19:46.830693 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 19:19:46.830832 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 19:19:46.835373 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 19:19:46.837878 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Feb 13 19:19:46.840015 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (833) Feb 13 19:19:46.840037 kernel: BTRFS info (device sda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 19:19:46.841487 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:19:46.841509 kernel: BTRFS info (device sda6): using free space tree Feb 13 19:19:46.846133 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 19:19:46.846509 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 19:19:46.865754 initrd-setup-root[857]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 19:19:46.868127 initrd-setup-root[864]: cut: /sysroot/etc/group: No such file or directory Feb 13 19:19:46.870217 initrd-setup-root[871]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 19:19:46.872067 initrd-setup-root[878]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 19:19:46.925280 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 19:19:46.929768 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 19:19:46.931190 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 19:19:46.936715 kernel: BTRFS info (device sda6): last unmount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 19:19:46.949642 ignition[945]: INFO : Ignition 2.20.0 Feb 13 19:19:46.949642 ignition[945]: INFO : Stage: mount Feb 13 19:19:46.949642 ignition[945]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:46.949642 ignition[945]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:46.949642 ignition[945]: INFO : mount: mount passed Feb 13 19:19:46.949642 ignition[945]: INFO : Ignition finished successfully Feb 13 19:19:46.950652 systemd[1]: Finished ignition-mount.service - Ignition (mount). Feb 13 19:19:46.954783 systemd[1]: Starting ignition-files.service - Ignition (files)... Feb 13 19:19:46.955190 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Feb 13 19:19:47.755129 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 19:19:47.759840 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 19:19:47.769345 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (958) Feb 13 19:19:47.772177 kernel: BTRFS info (device sda6): first mount of filesystem 9d862461-eab1-477f-8790-b61f63b2958e Feb 13 19:19:47.772198 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:19:47.772209 kernel: BTRFS info (device sda6): using free space tree Feb 13 19:19:47.776720 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 19:19:47.777403 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 19:19:47.800881 ignition[975]: INFO : Ignition 2.20.0 Feb 13 19:19:47.800881 ignition[975]: INFO : Stage: files Feb 13 19:19:47.801204 ignition[975]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:47.801204 ignition[975]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:47.801447 ignition[975]: DEBUG : files: compiled without relabeling support, skipping Feb 13 19:19:47.802115 ignition[975]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 19:19:47.802115 ignition[975]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 19:19:47.803833 ignition[975]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 19:19:47.803975 ignition[975]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 19:19:47.804115 ignition[975]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 19:19:47.804085 unknown[975]: wrote ssh authorized keys file for user: core Feb 13 19:19:47.805416 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Feb 13 19:19:47.805733 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Feb 13 19:19:47.849499 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 19:19:47.910916 systemd-networkd[807]: ens192: Gained IPv6LL Feb 13 19:19:47.970430 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 19:19:47.970770 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 19:19:47.972110 ignition[975]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 19:19:47.972110 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 Feb 13 19:19:48.524546 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Feb 13 19:19:48.665097 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 19:19:48.665397 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 19:19:48.665397 ignition[975]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 19:19:48.665397 ignition[975]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Feb 13 19:19:48.665989 ignition[975]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Feb 13 19:19:48.687298 ignition[975]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 19:19:48.689185 ignition[975]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 19:19:48.689488 ignition[975]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Feb 13 19:19:48.689488 ignition[975]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Feb 13 19:19:48.689488 ignition[975]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 19:19:48.690416 ignition[975]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:19:48.690416 ignition[975]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:19:48.690416 ignition[975]: INFO : files: files passed Feb 13 19:19:48.690416 ignition[975]: INFO : Ignition finished successfully Feb 13 
19:19:48.690551 systemd[1]: Finished ignition-files.service - Ignition (files). Feb 13 19:19:48.697894 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Feb 13 19:19:48.699042 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 19:19:48.700339 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 19:19:48.700400 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 19:19:48.707276 initrd-setup-root-after-ignition[1005]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:19:48.707276 initrd-setup-root-after-ignition[1005]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:19:48.708252 initrd-setup-root-after-ignition[1009]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:19:48.708840 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:19:48.709161 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 19:19:48.712928 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 19:19:48.723623 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 19:19:48.723674 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 19:19:48.724070 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 19:19:48.724194 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 19:19:48.724379 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 19:19:48.724819 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 19:19:48.733069 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 19:19:48.733886 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 19:19:48.740884 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:19:48.741155 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:19:48.741579 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 19:19:48.741849 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 19:19:48.741914 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 19:19:48.742410 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 19:19:48.742683 systemd[1]: Stopped target basic.target - Basic System. Feb 13 19:19:48.742942 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 19:19:48.743088 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 19:19:48.743638 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 19:19:48.743811 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 19:19:48.744083 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 19:19:48.744361 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 19:19:48.744667 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 19:19:48.745074 systemd[1]: Stopped target swap.target - Swaps. Feb 13 19:19:48.745188 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
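
The files stage recorded above (ops 1 through 13) is driven entirely by the provisioning config fetched earlier: it added SSH keys for the core user, downloaded the Helm tarball and the Kubernetes sysext image, linked the extension into /etc/extensions, enabled prepare-helm.service, and set coreos-metadata.service's preset to disabled. A skeletal Ignition config that produces these kinds of operations might look as follows; the spec version, key material, and unit body are placeholders, not the config actually used on this host:

cat > example.ign <<'EOF'
{
  "ignition": { "version": "3.4.0" },
  "passwd": {
    "users": [
      { "name": "core",
        "sshAuthorizedKeys": [ "ssh-ed25519 AAAA... placeholder" ] }
    ]
  },
  "storage": {
    "files": [
      { "path": "/opt/helm-v3.17.0-linux-amd64.tar.gz",
        "contents": { "source": "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz" } }
    ],
    "links": [
      { "path": "/etc/extensions/kubernetes.raw",
        "target": "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" }
    ]
  },
  "systemd": {
    "units": [
      { "name": "prepare-helm.service", "enabled": true,
        "contents": "[Unit]\nDescription=placeholder\n" }
    ]
  }
}
EOF
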
Feb 13 19:19:48.745252 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Feb 13 19:19:48.745796 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:19:48.746072 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:19:48.746338 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Feb 13 19:19:48.746386 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:19:48.746553 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 19:19:48.746617 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 19:19:48.747313 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 19:19:48.747382 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 19:19:48.747590 systemd[1]: Stopped target paths.target - Path Units. Feb 13 19:19:48.747698 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 19:19:48.747754 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:19:48.747988 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 19:19:48.748180 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 19:19:48.748345 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 19:19:48.748391 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:19:48.748539 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 19:19:48.748580 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 19:19:48.748782 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 19:19:48.748842 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:19:48.749054 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 19:19:48.749112 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 19:19:48.765839 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 19:19:48.765967 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 19:19:48.766060 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:19:48.768859 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 19:19:48.768987 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 19:19:48.769105 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:19:48.769492 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 19:19:48.769784 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 19:19:48.776658 ignition[1029]: INFO : Ignition 2.20.0 Feb 13 19:19:48.776658 ignition[1029]: INFO : Stage: umount Feb 13 19:19:48.778936 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:19:48.778936 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:19:48.778936 ignition[1029]: INFO : umount: umount passed Feb 13 19:19:48.778936 ignition[1029]: INFO : Ignition finished successfully Feb 13 19:19:48.778065 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 19:19:48.778131 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 19:19:48.778466 systemd[1]: ignition-mount.service: Deactivated successfully. 
Feb 13 19:19:48.778522 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 19:19:48.779918 systemd[1]: Stopped target network.target - Network. Feb 13 19:19:48.780986 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 19:19:48.781020 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 19:19:48.781151 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 19:19:48.781176 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 19:19:48.781791 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 19:19:48.781814 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 19:19:48.781947 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Feb 13 19:19:48.781970 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 19:19:48.782131 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Feb 13 19:19:48.782263 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 19:19:48.785167 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 19:19:48.785216 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 19:19:48.786852 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 19:19:48.786920 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Feb 13 19:19:48.787280 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Feb 13 19:19:48.787321 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:19:48.788276 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Feb 13 19:19:48.790591 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 19:19:48.790758 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 19:19:48.791555 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Feb 13 19:19:48.791774 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 19:19:48.791814 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:19:48.795762 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 19:19:48.795983 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 19:19:48.796010 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 19:19:48.796120 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Feb 13 19:19:48.796144 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 19:19:48.796243 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 19:19:48.796264 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:19:48.796400 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 19:19:48.796421 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 19:19:48.796536 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:19:48.797081 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 19:19:48.801619 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 19:19:48.801671 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Feb 13 19:19:48.808076 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 19:19:48.808150 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:19:48.808458 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 19:19:48.808483 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 19:19:48.808686 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 19:19:48.808709 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:19:48.809507 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 19:19:48.809532 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Feb 13 19:19:48.809826 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 19:19:48.809848 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 19:19:48.810143 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 19:19:48.810165 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:19:48.814909 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 19:19:48.815009 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 19:19:48.815036 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 19:19:48.815638 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 19:19:48.815664 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:19:48.816340 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Feb 13 19:19:48.816374 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Feb 13 19:19:48.817508 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 19:19:48.817550 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 19:19:48.862313 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 19:19:48.862389 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 19:19:48.862898 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 19:19:48.863471 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 19:19:48.863512 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 19:19:48.867800 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 19:19:48.873169 systemd[1]: Switching root. Feb 13 19:19:48.911730 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). Feb 13 19:19:48.911773 systemd-journald[217]: Journal stopped Feb 13 19:19:49.926122 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 19:19:49.926142 kernel: SELinux: policy capability open_perms=1 Feb 13 19:19:49.926149 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 19:19:49.926155 kernel: SELinux: policy capability always_check_network=0 Feb 13 19:19:49.926160 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 19:19:49.926165 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 19:19:49.926172 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 19:19:49.926178 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 19:19:49.926184 systemd[1]: Successfully loaded SELinux policy in 32.664ms. 
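
At the switch-root above, the initrd's journald (PID 217) receives SIGTERM and hands off to the system instance, and the kernel then reports each SELinux policy capability as the policy loads in about 33 ms. The initrd entries survive in the journal of the current boot, and short-precise output matches the microsecond timestamps seen throughout this log. Two read-only checks, as a sketch (sestatus assumes policycoreutils is installed):

journalctl -b 0 -o short-precise | head   # current boot, initrd entries included
sestatus                                  # confirm the loaded SELinux state
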
Feb 13 19:19:49.926192 kernel: audit: type=1403 audit(1739474389.459:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 19:19:49.926198 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.206ms. Feb 13 19:19:49.926205 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 19:19:49.926211 systemd[1]: Detected virtualization vmware. Feb 13 19:19:49.926218 systemd[1]: Detected architecture x86-64. Feb 13 19:19:49.926224 systemd[1]: Detected first boot. Feb 13 19:19:49.926231 systemd[1]: Initializing machine ID from random generator. Feb 13 19:19:49.926237 zram_generator::config[1073]: No configuration found. Feb 13 19:19:49.926321 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Feb 13 19:19:49.926331 kernel: Guest personality initialized and is active Feb 13 19:19:49.926337 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Feb 13 19:19:49.926343 kernel: Initialized host personality Feb 13 19:19:49.926348 kernel: NET: Registered PF_VSOCK protocol family Feb 13 19:19:49.926355 systemd[1]: Populated /etc with preset unit settings. Feb 13 19:19:49.926362 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:19:49.926370 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Feb 13 19:19:49.926377 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Feb 13 19:19:49.926383 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 19:19:49.926389 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Feb 13 19:19:49.926396 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 19:19:49.926402 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Feb 13 19:19:49.926410 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Feb 13 19:19:49.926417 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Feb 13 19:19:49.926423 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Feb 13 19:19:49.926430 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Feb 13 19:19:49.926436 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Feb 13 19:19:49.926443 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Feb 13 19:19:49.926449 systemd[1]: Created slice user.slice - User and Session Slice. Feb 13 19:19:49.926456 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:19:49.926462 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:19:49.926470 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Feb 13 19:19:49.926478 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
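
This is a first boot (flatcar.first_boot=detected on the kernel command line), so systemd populates /etc from preset unit settings and, as logged above, mints the machine ID from the random generator instead of reading an existing /etc/machine-id. Inspecting it, and regenerating it on a cloned image, as a sketch:

cat /etc/machine-id   # the ID minted at first boot
# On a clone, clear and regenerate (destructive; sketch only):
# rm /etc/machine-id && systemd-machine-id-setup
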
Feb 13 19:19:49.926485 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Feb 13 19:19:49.926491 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 19:19:49.926498 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Feb 13 19:19:49.926505 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:19:49.926511 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Feb 13 19:19:49.926519 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Feb 13 19:19:49.926525 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Feb 13 19:19:49.926532 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Feb 13 19:19:49.926538 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:19:49.926545 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 19:19:49.926552 systemd[1]: Reached target slices.target - Slice Units. Feb 13 19:19:49.926558 systemd[1]: Reached target swap.target - Swaps. Feb 13 19:19:49.926564 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Feb 13 19:19:49.926571 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Feb 13 19:19:49.926579 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Feb 13 19:19:49.926585 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:19:49.926592 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 19:19:49.926599 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:19:49.926607 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Feb 13 19:19:49.926613 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Feb 13 19:19:49.926620 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Feb 13 19:19:49.926626 systemd[1]: Mounting media.mount - External Media Directory... Feb 13 19:19:49.926633 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:49.926640 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Feb 13 19:19:49.926646 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Feb 13 19:19:49.926653 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Feb 13 19:19:49.926660 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 19:19:49.926668 systemd[1]: Reached target machines.target - Containers. Feb 13 19:19:49.926675 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Feb 13 19:19:49.926682 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Feb 13 19:19:49.926688 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 19:19:49.926695 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Feb 13 19:19:49.931047 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 19:19:49.931063 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Feb 13 19:19:49.931071 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 19:19:49.931082 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Feb 13 19:19:49.931088 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 19:19:49.931095 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 19:19:49.931102 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 19:19:49.931109 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Feb 13 19:19:49.931116 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 19:19:49.931122 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 19:19:49.931130 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 19:19:49.931138 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 19:19:49.931145 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 19:19:49.931152 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Feb 13 19:19:49.931159 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Feb 13 19:19:49.931165 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Feb 13 19:19:49.931186 systemd-journald[1166]: Collecting audit messages is disabled. Feb 13 19:19:49.931204 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 19:19:49.931211 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 19:19:49.931218 systemd[1]: Stopped verity-setup.service. Feb 13 19:19:49.931224 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:49.931231 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Feb 13 19:19:49.931238 systemd-journald[1166]: Journal started Feb 13 19:19:49.931254 systemd-journald[1166]: Runtime Journal (/run/log/journal/5eecea88783644acabfed1f7aa143610) is 4.8M, max 38.6M, 33.8M free. Feb 13 19:19:49.789662 systemd[1]: Queued start job for default target multi-user.target. Feb 13 19:19:49.799068 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Feb 13 19:19:49.799326 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 19:19:49.933058 jq[1143]: true Feb 13 19:19:49.933834 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 19:19:49.934311 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Feb 13 19:19:49.935799 systemd[1]: Mounted media.mount - External Media Directory. Feb 13 19:19:49.937847 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Feb 13 19:19:49.938299 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Feb 13 19:19:49.938459 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Feb 13 19:19:49.938710 kernel: fuse: init (API version 7.39) Feb 13 19:19:49.938718 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. 
Feb 13 19:19:49.940873 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:19:49.941112 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 19:19:49.941203 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Feb 13 19:19:49.941470 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 19:19:49.941596 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 19:19:49.941834 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 19:19:49.941923 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 19:19:49.942149 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 19:19:49.942246 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Feb 13 19:19:49.942668 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 19:19:49.944474 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Feb 13 19:19:49.945801 jq[1188]: true Feb 13 19:19:49.955744 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Feb 13 19:19:49.961268 systemd[1]: Reached target network-pre.target - Preparation for Network. Feb 13 19:19:49.975436 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Feb 13 19:19:49.979176 kernel: loop: module loaded Feb 13 19:19:49.978750 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Feb 13 19:19:49.978869 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 19:19:49.978887 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 19:19:49.979573 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Feb 13 19:19:49.982834 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Feb 13 19:19:49.989085 kernel: ACPI: bus type drm_connector registered Feb 13 19:19:49.992368 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Feb 13 19:19:49.992543 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:19:49.994810 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Feb 13 19:19:49.997088 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Feb 13 19:19:49.997218 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 19:19:50.010000 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Feb 13 19:19:50.013106 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 19:19:50.014433 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Feb 13 19:19:50.024616 systemd[1]: Starting systemd-sysusers.service - Create System Users... Feb 13 19:19:50.026689 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 19:19:50.027025 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 19:19:50.027384 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 19:19:50.028739 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
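
The cluster of modprobe@*.service results above (configfs, dm_mod, efi_pstore, fuse, and later drm and loop) all come from a single template unit: modprobe@.service runs modprobe on its instance name, which lets early boot pull kernel modules in as ordinary unit dependencies. The template can be inspected or invoked directly:

systemctl cat modprobe@drm.service | head   # the template, rendered for "drm"
systemctl start modprobe@fuse.service       # roughly equivalent to: modprobe fuse
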
Feb 13 19:19:50.029040 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Feb 13 19:19:50.029271 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Feb 13 19:19:50.029446 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Feb 13 19:19:50.035356 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Feb 13 19:19:50.038449 systemd-journald[1166]: Time spent on flushing to /var/log/journal/5eecea88783644acabfed1f7aa143610 is 39.882ms for 1849 entries. Feb 13 19:19:50.038449 systemd-journald[1166]: System Journal (/var/log/journal/5eecea88783644acabfed1f7aa143610) is 8M, max 584.8M, 576.8M free. Feb 13 19:19:50.110586 systemd-journald[1166]: Received client request to flush runtime journal. Feb 13 19:19:50.110613 kernel: loop0: detected capacity change from 0 to 138176 Feb 13 19:19:50.042336 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Feb 13 19:19:50.073108 ignition[1197]: Ignition 2.20.0 Feb 13 19:19:50.045694 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Feb 13 19:19:50.073278 ignition[1197]: deleting config from guestinfo properties Feb 13 19:19:50.052812 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Feb 13 19:19:50.093024 ignition[1197]: Successfully deleted config Feb 13 19:19:50.052983 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 19:19:50.094298 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Feb 13 19:19:50.111397 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Feb 13 19:19:50.118341 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:19:50.118895 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Feb 13 19:19:50.126357 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:19:50.132931 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Feb 13 19:19:50.135977 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Feb 13 19:19:50.135384 systemd[1]: Finished systemd-sysusers.service - Create System Users. Feb 13 19:19:50.146594 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 19:19:50.150424 udevadm[1241]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Feb 13 19:19:50.158716 kernel: loop1: detected capacity change from 0 to 147912 Feb 13 19:19:50.161862 systemd-tmpfiles[1245]: ACLs are not supported, ignoring. Feb 13 19:19:50.162084 systemd-tmpfiles[1245]: ACLs are not supported, ignoring. Feb 13 19:19:50.168659 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
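
The journald accounting above is the flush from the 4.8M runtime journal in /run to the persistent system journal under /var/log/journal/<machine-id>, moving 1849 entries in roughly 40 ms. The same numbers, and the same flush, can be requested by hand:

journalctl --disk-usage   # combined runtime and persistent journal size
journalctl --flush        # what systemd-journal-flush.service triggers
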
Feb 13 19:19:50.193720 kernel: loop2: detected capacity change from 0 to 2960 Feb 13 19:19:50.235441 kernel: loop3: detected capacity change from 0 to 218376 Feb 13 19:19:50.286805 kernel: loop4: detected capacity change from 0 to 138176 Feb 13 19:19:50.315357 kernel: loop5: detected capacity change from 0 to 147912 Feb 13 19:19:50.341885 kernel: loop6: detected capacity change from 0 to 2960 Feb 13 19:19:50.358881 kernel: loop7: detected capacity change from 0 to 218376 Feb 13 19:19:50.382342 (sd-merge)[1252]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Feb 13 19:19:50.382714 (sd-merge)[1252]: Merged extensions into '/usr'. Feb 13 19:19:50.388469 systemd[1]: Reload requested from client PID 1217 ('systemd-sysext') (unit systemd-sysext.service)... Feb 13 19:19:50.388525 systemd[1]: Reloading... Feb 13 19:19:50.444142 zram_generator::config[1279]: No configuration found. Feb 13 19:19:50.550542 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:19:50.572757 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:19:50.588679 ldconfig[1211]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 19:19:50.620385 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 19:19:50.620724 systemd[1]: Reloading finished in 231 ms. Feb 13 19:19:50.639011 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Feb 13 19:19:50.639362 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Feb 13 19:19:50.639629 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Feb 13 19:19:50.644636 systemd[1]: Starting ensure-sysext.service... Feb 13 19:19:50.647229 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 19:19:50.649817 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:19:50.654941 systemd[1]: Reload requested from client PID 1337 ('systemctl') (unit ensure-sysext.service)... Feb 13 19:19:50.654983 systemd[1]: Reloading... Feb 13 19:19:50.660380 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 19:19:50.660547 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Feb 13 19:19:50.661040 systemd-tmpfiles[1338]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 19:19:50.661198 systemd-tmpfiles[1338]: ACLs are not supported, ignoring. Feb 13 19:19:50.661230 systemd-tmpfiles[1338]: ACLs are not supported, ignoring. Feb 13 19:19:50.664973 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 19:19:50.664980 systemd-tmpfiles[1338]: Skipping /boot Feb 13 19:19:50.672553 systemd-udevd[1339]: Using default interface naming scheme 'v255'. Feb 13 19:19:50.675297 systemd-tmpfiles[1338]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 19:19:50.675302 systemd-tmpfiles[1338]: Skipping /boot Feb 13 19:19:50.711716 zram_generator::config[1364]: No configuration found. 
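
The (sd-merge) lines above are systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes, and oem-vmware extension images onto /usr, followed by a service-manager reload so units shipped inside the extensions become visible; the loop0 through loop7 capacity changes just before are most likely those images being attached for scanning. Merge state can be checked, or the merge redone after adding an image:

systemd-sysext status    # which hierarchies are merged, and from what
systemd-sysext refresh   # unmerge and re-merge, picking up new images
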
Feb 13 19:19:50.798799 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1378) Feb 13 19:19:50.815381 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:19:50.817744 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Feb 13 19:19:50.828730 kernel: ACPI: button: Power Button [PWRF] Feb 13 19:19:50.836601 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:19:50.888934 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 19:19:50.889068 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Feb 13 19:19:50.889344 systemd[1]: Reloading finished in 234 ms. Feb 13 19:19:50.895531 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:19:50.899943 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Feb 13 19:19:50.903177 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:19:50.923214 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:50.927508 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 19:19:50.927839 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Feb 13 19:19:50.928633 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Feb 13 19:19:50.930336 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 19:19:50.930977 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 19:19:50.931818 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 19:19:50.932826 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:19:50.939075 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Feb 13 19:19:50.939265 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 19:19:50.941270 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Feb 13 19:19:50.943158 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 19:19:50.945886 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 19:19:50.948888 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Feb 13 19:19:50.949002 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:50.951289 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 19:19:50.951772 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 19:19:50.952129 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Feb 13 19:19:50.952736 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 19:19:50.953063 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 19:19:50.953157 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 19:19:50.959114 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:50.964119 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 19:19:50.972366 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 19:19:50.975852 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 19:19:50.981019 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 19:19:50.981184 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 19:19:50.981254 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 19:19:50.981354 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 19:19:50.985542 systemd[1]: Finished ensure-sysext.service. Feb 13 19:19:50.985839 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Feb 13 19:19:50.994068 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Feb 13 19:19:50.996714 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 19:19:51.000781 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Feb 13 19:19:51.000885 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 19:19:51.001734 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Feb 13 19:19:51.002811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 19:19:51.003800 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 19:19:51.004121 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 19:19:51.004224 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 19:19:51.010019 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 19:19:51.011553 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 19:19:51.016598 (udev-worker)[1375]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Feb 13 19:19:51.017738 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Feb 13 19:19:51.018090 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 19:19:51.018188 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 19:19:51.020106 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Feb 13 19:19:51.023579 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
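The modprobe@dm_mod, modprobe@drm, modprobe@efi_pstore and modprobe@loop units cycling above are instances of systemd's modprobe@.service template: each instance simply loads the kernel module named in its instance string. For orientation, the shell equivalent of what those instances do (not how the boot actually invokes them):

    modprobe dm_mod       # modprobe@dm_mod.service
    modprobe drm          # modprobe@drm.service
    modprobe efi_pstore   # modprobe@efi_pstore.service
    modprobe loop         # modprobe@loop.service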
Feb 13 19:19:51.023671 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 19:19:51.027850 systemd[1]: Starting systemd-update-done.service - Update is Completed... Feb 13 19:19:51.033875 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:19:51.037279 augenrules[1503]: No rules Feb 13 19:19:51.037346 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 19:19:51.037465 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 19:19:51.037775 systemd[1]: Finished systemd-update-done.service - Update is Completed. Feb 13 19:19:51.044917 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Feb 13 19:19:51.050894 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Feb 13 19:19:51.051546 systemd[1]: Started systemd-userdbd.service - User Database Manager. Feb 13 19:19:51.061483 lvm[1513]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 19:19:51.089067 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Feb 13 19:19:51.089293 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:19:51.093838 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Feb 13 19:19:51.098685 lvm[1523]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 19:19:51.117739 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:19:51.128921 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Feb 13 19:19:51.129638 systemd-networkd[1459]: lo: Link UP Feb 13 19:19:51.129791 systemd-networkd[1459]: lo: Gained carrier Feb 13 19:19:51.130225 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Feb 13 19:19:51.130653 systemd-networkd[1459]: Enumeration completed Feb 13 19:19:51.130779 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 19:19:51.130920 systemd[1]: Reached target time-set.target - System Time Set. Feb 13 19:19:51.131020 systemd-networkd[1459]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Feb 13 19:19:51.132934 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 19:19:51.133054 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 19:19:51.133402 systemd-networkd[1459]: ens192: Link UP Feb 13 19:19:51.133526 systemd-networkd[1459]: ens192: Gained carrier Feb 13 19:19:51.135276 systemd-timesyncd[1481]: Network configuration changed, trying to establish connection. Feb 13 19:19:51.135480 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Feb 13 19:19:51.139237 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Feb 13 19:19:51.141661 systemd-resolved[1460]: Positive Trust Anchors: Feb 13 19:19:51.142567 systemd-resolved[1460]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 19:19:51.142591 systemd-resolved[1460]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 19:19:51.145594 systemd-resolved[1460]: Defaulting to hostname 'linux'. Feb 13 19:19:51.146607 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 19:19:51.146776 systemd[1]: Reached target network.target - Network. Feb 13 19:19:51.146875 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:19:51.146994 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 19:19:51.147169 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Feb 13 19:19:51.147301 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Feb 13 19:19:51.147533 systemd[1]: Started logrotate.timer - Daily rotation of log files. Feb 13 19:19:51.147682 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Feb 13 19:19:51.147807 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Feb 13 19:19:51.147921 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 19:19:51.147940 systemd[1]: Reached target paths.target - Path Units. Feb 13 19:19:51.148032 systemd[1]: Reached target timers.target - Timer Units. Feb 13 19:19:51.148571 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Feb 13 19:19:51.149568 systemd[1]: Starting docker.socket - Docker Socket for the API... Feb 13 19:19:51.151073 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Feb 13 19:19:51.151273 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Feb 13 19:19:51.151397 systemd[1]: Reached target ssh-access.target - SSH Access Available. Feb 13 19:19:51.157004 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Feb 13 19:19:51.157417 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Feb 13 19:19:51.158153 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Feb 13 19:19:51.158394 systemd[1]: Listening on docker.socket - Docker Socket for the API. Feb 13 19:19:51.158830 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 19:19:51.158968 systemd[1]: Reached target basic.target - Basic System. Feb 13 19:19:51.159131 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Feb 13 19:19:51.159158 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Feb 13 19:19:51.160001 systemd[1]: Starting containerd.service - containerd container runtime... 
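Two details in the resolver and network bring-up above: the ". IN DS 20326 8 2 ..." record systemd-resolved installs as its positive trust anchor is the published root-zone KSK-2017 DNSSEC key, and ens192 was configured from /etc/systemd/network/00-vmware.network. The log never shows that file's contents; a plausible minimal form for a DHCP-configured VMware NIC would be (illustrative sketch only, the file Flatcar ships may differ):

    [Match]
    Name=ens192

    [Network]
    DHCP=yes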
Feb 13 19:19:51.161908 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Feb 13 19:19:51.163432 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Feb 13 19:19:51.165924 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Feb 13 19:19:51.166052 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Feb 13 19:19:51.167816 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Feb 13 19:19:51.168836 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Feb 13 19:19:51.169797 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Feb 13 19:19:51.172832 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Feb 13 19:19:51.174314 jq[1536]: false Feb 13 19:19:51.175083 systemd[1]: Starting systemd-logind.service - User Login Management... Feb 13 19:19:51.175652 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Feb 13 19:19:51.177118 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 19:19:51.177808 systemd[1]: Starting update-engine.service - Update Engine... Feb 13 19:19:51.179744 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Feb 13 19:19:51.184782 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Feb 13 19:19:51.184834 dbus-daemon[1535]: [system] SELinux support is enabled Feb 13 19:19:51.185206 systemd[1]: Started dbus.service - D-Bus System Message Bus. Feb 13 19:19:51.191898 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 19:19:51.192842 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Feb 13 19:19:51.193050 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 19:19:51.193160 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Feb 13 19:19:51.195435 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 19:19:51.195476 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Feb 13 19:19:51.196764 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 19:19:51.196775 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Feb 13 19:19:51.200326 jq[1545]: true Feb 13 19:19:51.204046 systemd[1]: motdgen.service: Deactivated successfully. 
Feb 13 19:19:51.204178 extend-filesystems[1537]: Found loop4 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found loop5 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found loop6 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found loop7 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda1 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda2 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda3 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found usr Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda4 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda6 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda7 Feb 13 19:19:51.204828 extend-filesystems[1537]: Found sda9 Feb 13 19:19:51.204828 extend-filesystems[1537]: Checking size of /dev/sda9 Feb 13 19:19:51.204184 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Feb 13 19:19:51.210077 (ntainerd)[1558]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Feb 13 19:19:51.215406 jq[1563]: true Feb 13 19:19:51.226653 extend-filesystems[1537]: Old size kept for /dev/sda9 Feb 13 19:19:51.226867 extend-filesystems[1537]: Found sr0 Feb 13 19:19:51.233133 update_engine[1543]: I20250213 19:19:51.233085 1543 main.cc:92] Flatcar Update Engine starting Feb 13 19:19:51.233854 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 19:19:51.233995 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Feb 13 19:19:51.234893 tar[1559]: linux-amd64/LICENSE Feb 13 19:19:51.235221 tar[1559]: linux-amd64/helm Feb 13 19:19:51.241742 update_engine[1543]: I20250213 19:19:51.240689 1543 update_check_scheduler.cc:74] Next update check in 6m31s Feb 13 19:19:51.240493 systemd[1]: Started update-engine.service - Update Engine. Feb 13 19:19:51.250809 systemd[1]: Started locksmithd.service - Cluster reboot manager. Feb 13 19:19:51.251793 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Feb 13 19:19:51.255755 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Feb 13 19:19:51.304780 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1374) Feb 13 19:19:51.312080 systemd-logind[1542]: Watching system buttons on /dev/input/event1 (Power Button) Feb 13 19:19:51.312272 systemd-logind[1542]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Feb 13 19:19:51.312377 systemd-logind[1542]: New seat seat0. Feb 13 19:19:51.314663 bash[1594]: Updated "/home/core/.ssh/authorized_keys" Feb 13 19:19:51.320795 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Feb 13 19:19:51.321015 systemd[1]: Started systemd-logind.service - User Login Management. Feb 13 19:19:51.321284 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Feb 13 19:19:51.333839 unknown[1579]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Feb 13 19:19:51.334411 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Feb 13 19:19:51.341129 unknown[1579]: Core dump limit set to -1 Feb 13 19:21:05.362960 systemd-timesyncd[1481]: Contacted time server 198.137.202.56:123 (0.flatcar.pool.ntp.org). Feb 13 19:21:05.362989 systemd-timesyncd[1481]: Initial clock synchronization to Thu 2025-02-13 19:21:05.362893 UTC. 
Feb 13 19:21:05.363016 systemd-resolved[1460]: Clock change detected. Flushing caches. Feb 13 19:21:05.485700 locksmithd[1576]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 19:21:05.527875 containerd[1558]: time="2025-02-13T19:21:05.527809930Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Feb 13 19:21:05.569736 containerd[1558]: time="2025-02-13T19:21:05.569209458Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571138238Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571155223Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571165580Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571245578Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571254708Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571289643Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571297049Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571384933Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571392650Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571399396Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572050 containerd[1558]: time="2025-02-13T19:21:05.571404797Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571442442Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571602926Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571678145Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571701854Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571756873Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 19:21:05.572207 containerd[1558]: time="2025-02-13T19:21:05.571783677Z" level=info msg="metadata content store policy set" policy=shared Feb 13 19:21:05.574978 containerd[1558]: time="2025-02-13T19:21:05.574965968Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 19:21:05.575147 containerd[1558]: time="2025-02-13T19:21:05.575138548Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 19:21:05.575333 containerd[1558]: time="2025-02-13T19:21:05.575323584Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Feb 13 19:21:05.575707 containerd[1558]: time="2025-02-13T19:21:05.575598559Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Feb 13 19:21:05.575707 containerd[1558]: time="2025-02-13T19:21:05.575615430Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Feb 13 19:21:05.575707 containerd[1558]: time="2025-02-13T19:21:05.575682610Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 19:21:05.575908 containerd[1558]: time="2025-02-13T19:21:05.575897615Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 19:21:05.576208 containerd[1558]: time="2025-02-13T19:21:05.576197791Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Feb 13 19:21:05.576449 containerd[1558]: time="2025-02-13T19:21:05.576440475Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Feb 13 19:21:05.576487 containerd[1558]: time="2025-02-13T19:21:05.576479835Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Feb 13 19:21:05.576520 containerd[1558]: time="2025-02-13T19:21:05.576513531Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.576561 containerd[1558]: time="2025-02-13T19:21:05.576554018Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.576594 containerd[1558]: time="2025-02-13T19:21:05.576587899Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.576645 containerd[1558]: time="2025-02-13T19:21:05.576638204Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." 
type=io.containerd.service.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576833889Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576845839Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576853214Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576859645Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576871316Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576879388Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576886315Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576893489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576899991Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576907091Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576913852Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576921304Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576928279Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577333 containerd[1558]: time="2025-02-13T19:21:05.576937016Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576943465Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576949667Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576957254Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576965555Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576977730Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." 
type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576988790Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.577579 containerd[1558]: time="2025-02-13T19:21:05.576995307Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 19:21:05.578518 containerd[1558]: time="2025-02-13T19:21:05.578433551Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 19:21:05.578518 containerd[1558]: time="2025-02-13T19:21:05.578450033Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Feb 13 19:21:05.578518 containerd[1558]: time="2025-02-13T19:21:05.578456810Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 19:21:05.578518 containerd[1558]: time="2025-02-13T19:21:05.578464669Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Feb 13 19:21:05.578518 containerd[1558]: time="2025-02-13T19:21:05.578470150Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 19:21:05.579406 containerd[1558]: time="2025-02-13T19:21:05.578477211Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Feb 13 19:21:05.579406 containerd[1558]: time="2025-02-13T19:21:05.578608208Z" level=info msg="NRI interface is disabled by configuration." Feb 13 19:21:05.579406 containerd[1558]: time="2025-02-13T19:21:05.578614594Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 19:21:05.579463 containerd[1558]: time="2025-02-13T19:21:05.578770843Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 19:21:05.579463 containerd[1558]: time="2025-02-13T19:21:05.578801683Z" level=info msg="Connect containerd service" Feb 13 19:21:05.579827 containerd[1558]: time="2025-02-13T19:21:05.579566717Z" level=info msg="using legacy CRI server" Feb 13 19:21:05.579827 containerd[1558]: time="2025-02-13T19:21:05.579575974Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 19:21:05.579827 containerd[1558]: time="2025-02-13T19:21:05.579643741Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 19:21:05.580356 containerd[1558]: time="2025-02-13T19:21:05.580332368Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 19:21:05.580853 
containerd[1558]: time="2025-02-13T19:21:05.580837253Z" level=info msg="Start subscribing containerd event" Feb 13 19:21:05.581223 containerd[1558]: time="2025-02-13T19:21:05.580985139Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 19:21:05.581258 containerd[1558]: time="2025-02-13T19:21:05.581247311Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 19:21:05.581292 containerd[1558]: time="2025-02-13T19:21:05.581212255Z" level=info msg="Start recovering state" Feb 13 19:21:05.581355 containerd[1558]: time="2025-02-13T19:21:05.581347348Z" level=info msg="Start event monitor" Feb 13 19:21:05.581705 containerd[1558]: time="2025-02-13T19:21:05.581591357Z" level=info msg="Start snapshots syncer" Feb 13 19:21:05.581705 containerd[1558]: time="2025-02-13T19:21:05.581602927Z" level=info msg="Start cni network conf syncer for default" Feb 13 19:21:05.581705 containerd[1558]: time="2025-02-13T19:21:05.581607929Z" level=info msg="Start streaming server" Feb 13 19:21:05.581705 containerd[1558]: time="2025-02-13T19:21:05.581640190Z" level=info msg="containerd successfully booted in 0.054517s" Feb 13 19:21:05.581694 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 19:21:05.642820 sshd_keygen[1572]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 19:21:05.656710 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Feb 13 19:21:05.666009 systemd[1]: Starting issuegen.service - Generate /run/issue... Feb 13 19:21:05.669890 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 19:21:05.670021 systemd[1]: Finished issuegen.service - Generate /run/issue. Feb 13 19:21:05.672070 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Feb 13 19:21:05.679139 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Feb 13 19:21:05.683040 systemd[1]: Started getty@tty1.service - Getty on tty1. Feb 13 19:21:05.685976 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Feb 13 19:21:05.686158 systemd[1]: Reached target getty.target - Login Prompts. Feb 13 19:21:05.750776 tar[1559]: linux-amd64/README.md Feb 13 19:21:05.760586 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Feb 13 19:21:06.328909 systemd-networkd[1459]: ens192: Gained IPv6LL Feb 13 19:21:06.330990 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Feb 13 19:21:06.331893 systemd[1]: Reached target network-online.target - Network is Online. Feb 13 19:21:06.337075 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Feb 13 19:21:06.341071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:21:06.343008 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Feb 13 19:21:06.368286 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Feb 13 19:21:06.374885 systemd[1]: coreos-metadata.service: Deactivated successfully. Feb 13 19:21:06.375015 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Feb 13 19:21:06.375533 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Feb 13 19:21:07.051858 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:21:07.052185 systemd[1]: Reached target multi-user.target - Multi-User System. 
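The CRI config dump a few entries back is worth decoding: overlayfs snapshotter, runc as the default runtime via io.containerd.runc.v2, and SystemdCgroup:true, the cgroup driver kubelet expects on a systemd host. Reconstructed as the corresponding /etc/containerd/config.toml fragment (derived from the dumped values, not copied from disk):

    version = 2

    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      default_runtime_name = "runc"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = true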
Feb 13 19:21:07.052534 systemd[1]: Startup finished in 922ms (kernel) + 5.836s (initrd) + 3.622s (userspace) = 10.380s. Feb 13 19:21:07.056307 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:21:07.081296 login[1676]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 19:21:07.081455 login[1677]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 19:21:07.089758 systemd-logind[1542]: New session 2 of user core. Feb 13 19:21:07.091174 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 19:21:07.096990 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 19:21:07.099852 systemd-logind[1542]: New session 1 of user core. Feb 13 19:21:07.104858 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 19:21:07.115006 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 19:21:07.116837 (systemd)[1718]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 19:21:07.118241 systemd-logind[1542]: New session c1 of user core. Feb 13 19:21:07.200321 systemd[1718]: Queued start job for default target default.target. Feb 13 19:21:07.211748 systemd[1718]: Created slice app.slice - User Application Slice. Feb 13 19:21:07.211969 systemd[1718]: Reached target paths.target - Paths. Feb 13 19:21:07.212049 systemd[1718]: Reached target timers.target - Timers. Feb 13 19:21:07.212790 systemd[1718]: Starting dbus.socket - D-Bus User Message Bus Socket... Feb 13 19:21:07.219374 systemd[1718]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 19:21:07.219406 systemd[1718]: Reached target sockets.target - Sockets. Feb 13 19:21:07.219462 systemd[1718]: Reached target basic.target - Basic System. Feb 13 19:21:07.219508 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 19:21:07.219663 systemd[1718]: Reached target default.target - Main User Target. Feb 13 19:21:07.219680 systemd[1718]: Startup finished in 97ms. Feb 13 19:21:07.220905 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 19:21:07.221829 systemd[1]: Started session-2.scope - Session 2 of User core. Feb 13 19:21:07.553843 kubelet[1711]: E0213 19:21:07.553802 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:21:07.555229 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:21:07.555314 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:21:07.555591 systemd[1]: kubelet.service: Consumed 592ms CPU time, 253M memory peak. Feb 13 19:21:17.805763 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 19:21:17.818025 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:21:17.891469 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
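The kubelet failure above is the normal pre-bootstrap state rather than a packaging bug: /var/lib/kubelet/config.yaml is written by kubeadm during cluster bootstrap, so until that runs the service exits with status 1 and systemd keeps restarting it. A typical bootstrap that would create the file (all placeholders; no values here come from this host):

    # On a control-plane node:
    kubeadm init

    # On a worker, with values issued by the control plane:
    kubeadm join <control-plane-endpoint>:6443 \
        --token <token> \
        --discovery-token-ca-cert-hash sha256:<hash>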
Feb 13 19:21:17.893950 (kubelet)[1762]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:21:17.950915 kubelet[1762]: E0213 19:21:17.950875 1762 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:21:17.953215 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:21:17.953301 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:21:17.953507 systemd[1]: kubelet.service: Consumed 100ms CPU time, 104.5M memory peak. Feb 13 19:21:28.203775 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Feb 13 19:21:28.212932 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:21:28.529547 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:21:28.532361 (kubelet)[1777]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:21:28.578661 kubelet[1777]: E0213 19:21:28.578624 1777 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:21:28.580286 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:21:28.580426 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:21:28.580745 systemd[1]: kubelet.service: Consumed 77ms CPU time, 102.1M memory peak. Feb 13 19:21:38.830760 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Feb 13 19:21:38.838081 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:21:39.165398 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:21:39.168110 (kubelet)[1792]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:21:39.197492 kubelet[1792]: E0213 19:21:39.197457 1792 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:21:39.198599 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:21:39.198675 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:21:39.198857 systemd[1]: kubelet.service: Consumed 89ms CPU time, 101.8M memory peak. Feb 13 19:21:45.452886 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 19:21:45.453882 systemd[1]: Started sshd@0-139.178.70.106:22-139.178.89.65:53320.service - OpenSSH per-connection server daemon (139.178.89.65:53320). 
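Note the cadence of the failures so far: attempts at 19:21:07, 19:21:17, 19:21:28 and 19:21:39, each about ten seconds apart and each followed by "Scheduled restart job". That pattern is consistent with a unit restart stanza roughly like the following (a sketch of the policy implied by the log, not the shipped kubelet.service):

    [Service]
    Restart=always
    RestartSec=10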
Feb 13 19:21:45.522455 sshd[1800]: Accepted publickey for core from 139.178.89.65 port 53320 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:45.523290 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:45.527866 systemd-logind[1542]: New session 3 of user core. Feb 13 19:21:45.534935 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 19:21:45.591023 systemd[1]: Started sshd@1-139.178.70.106:22-139.178.89.65:53326.service - OpenSSH per-connection server daemon (139.178.89.65:53326). Feb 13 19:21:45.621785 sshd[1805]: Accepted publickey for core from 139.178.89.65 port 53326 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:45.622632 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:45.625840 systemd-logind[1542]: New session 4 of user core. Feb 13 19:21:45.637020 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 19:21:45.684673 sshd[1807]: Connection closed by 139.178.89.65 port 53326 Feb 13 19:21:45.685458 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Feb 13 19:21:45.690269 systemd[1]: Started sshd@2-139.178.70.106:22-139.178.89.65:53330.service - OpenSSH per-connection server daemon (139.178.89.65:53330). Feb 13 19:21:45.690729 systemd[1]: sshd@1-139.178.70.106:22-139.178.89.65:53326.service: Deactivated successfully. Feb 13 19:21:45.691766 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 19:21:45.692937 systemd-logind[1542]: Session 4 logged out. Waiting for processes to exit. Feb 13 19:21:45.693713 systemd-logind[1542]: Removed session 4. Feb 13 19:21:45.719567 sshd[1810]: Accepted publickey for core from 139.178.89.65 port 53330 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:45.720247 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:45.723788 systemd-logind[1542]: New session 5 of user core. Feb 13 19:21:45.729897 systemd[1]: Started session-5.scope - Session 5 of User core. Feb 13 19:21:45.776774 sshd[1815]: Connection closed by 139.178.89.65 port 53330 Feb 13 19:21:45.777081 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Feb 13 19:21:45.785834 systemd[1]: sshd@2-139.178.70.106:22-139.178.89.65:53330.service: Deactivated successfully. Feb 13 19:21:45.786577 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 19:21:45.786997 systemd-logind[1542]: Session 5 logged out. Waiting for processes to exit. Feb 13 19:21:45.787890 systemd[1]: Started sshd@3-139.178.70.106:22-139.178.89.65:53342.service - OpenSSH per-connection server daemon (139.178.89.65:53342). Feb 13 19:21:45.788983 systemd-logind[1542]: Removed session 5. Feb 13 19:21:45.817327 sshd[1820]: Accepted publickey for core from 139.178.89.65 port 53342 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:45.818150 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:45.821374 systemd-logind[1542]: New session 6 of user core. Feb 13 19:21:45.829995 systemd[1]: Started session-6.scope - Session 6 of User core. 
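Every accepted connection in this block reports the same client key, SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM, i.e. sessions 3 through 6 reuse one identity. Since an earlier entry showed /home/core/.ssh/authorized_keys being updated, the fingerprint can be cross-checked against that file with standard OpenSSH tooling (the path is taken from that earlier log line):

    ssh-keygen -lf /home/core/.ssh/authorized_keys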
Feb 13 19:21:45.879596 sshd[1823]: Connection closed by 139.178.89.65 port 53342 Feb 13 19:21:45.879471 sshd-session[1820]: pam_unix(sshd:session): session closed for user core Feb 13 19:21:45.891372 systemd[1]: sshd@3-139.178.70.106:22-139.178.89.65:53342.service: Deactivated successfully. Feb 13 19:21:45.892378 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 19:21:45.893317 systemd-logind[1542]: Session 6 logged out. Waiting for processes to exit. Feb 13 19:21:45.896045 systemd[1]: Started sshd@4-139.178.70.106:22-139.178.89.65:53350.service - OpenSSH per-connection server daemon (139.178.89.65:53350). Feb 13 19:21:45.897527 systemd-logind[1542]: Removed session 6. Feb 13 19:21:45.925261 sshd[1828]: Accepted publickey for core from 139.178.89.65 port 53350 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:45.926037 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:45.929466 systemd-logind[1542]: New session 7 of user core. Feb 13 19:21:45.938903 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 19:21:45.994167 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 19:21:45.994320 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:21:46.011144 sudo[1832]: pam_unix(sudo:session): session closed for user root Feb 13 19:21:46.011965 sshd[1831]: Connection closed by 139.178.89.65 port 53350 Feb 13 19:21:46.012762 sshd-session[1828]: pam_unix(sshd:session): session closed for user core Feb 13 19:21:46.022232 systemd[1]: sshd@4-139.178.70.106:22-139.178.89.65:53350.service: Deactivated successfully. Feb 13 19:21:46.023049 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 19:21:46.023807 systemd-logind[1542]: Session 7 logged out. Waiting for processes to exit. Feb 13 19:21:46.028055 systemd[1]: Started sshd@5-139.178.70.106:22-139.178.89.65:53366.service - OpenSSH per-connection server daemon (139.178.89.65:53366). Feb 13 19:21:46.029035 systemd-logind[1542]: Removed session 7. Feb 13 19:21:46.055961 sshd[1837]: Accepted publickey for core from 139.178.89.65 port 53366 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:46.056776 sshd-session[1837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:46.059706 systemd-logind[1542]: New session 8 of user core. Feb 13 19:21:46.070904 systemd[1]: Started session-8.scope - Session 8 of User core. Feb 13 19:21:46.119219 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 19:21:46.119420 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:21:46.121774 sudo[1842]: pam_unix(sudo:session): session closed for user root Feb 13 19:21:46.125438 sudo[1841]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 19:21:46.125799 sudo[1841]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:21:46.136130 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 19:21:46.154869 augenrules[1864]: No rules Feb 13 19:21:46.155347 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 19:21:46.155559 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
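augenrules again reports "No rules": the audit-rules restart was invoked immediately after the two default rule files were deleted via sudo above, and augenrules compiles the live ruleset from whatever remains under /etc/audit/rules.d/. A hypothetical rule file that would change that outcome:

    # /etc/audit/rules.d/10-example.rules (hypothetical)
    # Watch the certificate store for writes and attribute changes,
    # keyed so matches can be found with: ausearch -k certs
    -w /etc/ssl/certs -p wa -k certs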
Feb 13 19:21:46.156235 sudo[1841]: pam_unix(sudo:session): session closed for user root Feb 13 19:21:46.157110 sshd[1840]: Connection closed by 139.178.89.65 port 53366 Feb 13 19:21:46.157827 sshd-session[1837]: pam_unix(sshd:session): session closed for user core Feb 13 19:21:46.163375 systemd[1]: sshd@5-139.178.70.106:22-139.178.89.65:53366.service: Deactivated successfully. Feb 13 19:21:46.164350 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 19:21:46.165306 systemd-logind[1542]: Session 8 logged out. Waiting for processes to exit. Feb 13 19:21:46.169972 systemd[1]: Started sshd@6-139.178.70.106:22-139.178.89.65:53382.service - OpenSSH per-connection server daemon (139.178.89.65:53382). Feb 13 19:21:46.170562 systemd-logind[1542]: Removed session 8. Feb 13 19:21:46.197897 sshd[1872]: Accepted publickey for core from 139.178.89.65 port 53382 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM Feb 13 19:21:46.198657 sshd-session[1872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:21:46.201747 systemd-logind[1542]: New session 9 of user core. Feb 13 19:21:46.209995 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 19:21:46.260088 sudo[1876]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 19:21:46.260291 sudo[1876]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:21:46.573997 systemd[1]: Starting docker.service - Docker Application Container Engine... Feb 13 19:21:46.574087 (dockerd)[1893]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Feb 13 19:21:46.896053 dockerd[1893]: time="2025-02-13T19:21:46.895720083Z" level=info msg="Starting up" Feb 13 19:21:46.989695 dockerd[1893]: time="2025-02-13T19:21:46.989672766Z" level=info msg="Loading containers: start." Feb 13 19:21:47.085854 kernel: Initializing XFRM netlink socket Feb 13 19:21:47.139561 systemd-networkd[1459]: docker0: Link UP Feb 13 19:21:47.158994 dockerd[1893]: time="2025-02-13T19:21:47.158756745Z" level=info msg="Loading containers: done." Feb 13 19:21:47.168452 dockerd[1893]: time="2025-02-13T19:21:47.168424242Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 19:21:47.168539 dockerd[1893]: time="2025-02-13T19:21:47.168488722Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Feb 13 19:21:47.168564 dockerd[1893]: time="2025-02-13T19:21:47.168546871Z" level=info msg="Daemon has completed initialization" Feb 13 19:21:47.182564 dockerd[1893]: time="2025-02-13T19:21:47.182371348Z" level=info msg="API listen on /run/docker.sock" Feb 13 19:21:47.182445 systemd[1]: Started docker.service - Docker Application Container Engine. Feb 13 19:21:47.727080 containerd[1558]: time="2025-02-13T19:21:47.726995607Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.2\"" Feb 13 19:21:48.352036 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount268537552.mount: Deactivated successfully. 
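The PullImage entry closing this stretch is containerd fetching the first control-plane image into its k8s.io namespace (the tmpmount units are the scratch mounts it uses while unpacking layers). An equivalent manual pull, e.g. for pre-seeding a node, would be (illustrative command, not taken from this host):

    ctr --namespace k8s.io images pull registry.k8s.io/kube-apiserver:v1.32.2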
Feb 13 19:21:49.280515 containerd[1558]: time="2025-02-13T19:21:49.280467556Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:49.281298 containerd[1558]: time="2025-02-13T19:21:49.281034899Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.2: active requests=0, bytes read=28673931" Feb 13 19:21:49.281298 containerd[1558]: time="2025-02-13T19:21:49.281053664Z" level=info msg="ImageCreate event name:\"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:49.282729 containerd[1558]: time="2025-02-13T19:21:49.282714484Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c47449f3e751588ea0cb74e325e0f83db335a415f4f4c7fb147375dd6c84757f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:49.283392 containerd[1558]: time="2025-02-13T19:21:49.283378513Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.2\" with image id \"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c47449f3e751588ea0cb74e325e0f83db335a415f4f4c7fb147375dd6c84757f\", size \"28670731\" in 1.556358358s" Feb 13 19:21:49.283444 containerd[1558]: time="2025-02-13T19:21:49.283434630Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.2\" returns image reference \"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\"" Feb 13 19:21:49.283762 containerd[1558]: time="2025-02-13T19:21:49.283747174Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.2\"" Feb 13 19:21:49.436009 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Feb 13 19:21:49.441979 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:21:49.506676 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:21:49.509150 (kubelet)[2144]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:21:49.537497 kubelet[2144]: E0213 19:21:49.537413 2144 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:21:49.538619 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:21:49.538695 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:21:49.539051 systemd[1]: kubelet.service: Consumed 80ms CPU time, 103.8M memory peak. Feb 13 19:21:50.675884 update_engine[1543]: I20250213 19:21:50.675838 1543 update_attempter.cc:509] Updating boot flags... 
Feb 13 19:21:50.715875 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2164) Feb 13 19:21:51.080846 containerd[1558]: time="2025-02-13T19:21:51.080653626Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:51.087957 containerd[1558]: time="2025-02-13T19:21:51.087930245Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.2: active requests=0, bytes read=24771784" Feb 13 19:21:51.096284 containerd[1558]: time="2025-02-13T19:21:51.096253217Z" level=info msg="ImageCreate event name:\"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:51.102733 containerd[1558]: time="2025-02-13T19:21:51.102703516Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:399aa50f4d1361c59dc458e634506d02de32613d03a9a614a21058741162ef90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:51.103485 containerd[1558]: time="2025-02-13T19:21:51.103400274Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.2\" with image id \"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:399aa50f4d1361c59dc458e634506d02de32613d03a9a614a21058741162ef90\", size \"26259392\" in 1.819586088s" Feb 13 19:21:51.103485 containerd[1558]: time="2025-02-13T19:21:51.103423319Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.2\" returns image reference \"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\"" Feb 13 19:21:51.104110 containerd[1558]: time="2025-02-13T19:21:51.103930222Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.2\"" Feb 13 19:21:52.232737 containerd[1558]: time="2025-02-13T19:21:52.232693930Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:52.233384 containerd[1558]: time="2025-02-13T19:21:52.233360312Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.2: active requests=0, bytes read=19170276" Feb 13 19:21:52.233434 containerd[1558]: time="2025-02-13T19:21:52.233372704Z" level=info msg="ImageCreate event name:\"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:52.235505 containerd[1558]: time="2025-02-13T19:21:52.235485721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:45710d74cfd5aa10a001d0cf81747b77c28617444ffee0503d12f1dcd7450f76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:52.237802 containerd[1558]: time="2025-02-13T19:21:52.237784271Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.2\" with image id \"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:45710d74cfd5aa10a001d0cf81747b77c28617444ffee0503d12f1dcd7450f76\", size \"20657902\" in 1.133836407s" Feb 13 19:21:52.237847 containerd[1558]: time="2025-02-13T19:21:52.237802887Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.2\" returns 
image reference \"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\"" Feb 13 19:21:52.238196 containerd[1558]: time="2025-02-13T19:21:52.238069461Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.2\"" Feb 13 19:21:53.419543 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2892482789.mount: Deactivated successfully. Feb 13 19:21:53.775390 containerd[1558]: time="2025-02-13T19:21:53.775353013Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:53.780666 containerd[1558]: time="2025-02-13T19:21:53.780558352Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.2: active requests=0, bytes read=30908839" Feb 13 19:21:53.787050 containerd[1558]: time="2025-02-13T19:21:53.787016305Z" level=info msg="ImageCreate event name:\"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:53.792645 containerd[1558]: time="2025-02-13T19:21:53.792464662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:83c025f0faa6799fab6645102a98138e39a9a7db2be3bc792c79d72659b1805d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:53.793027 containerd[1558]: time="2025-02-13T19:21:53.793005506Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.2\" with image id \"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\", repo tag \"registry.k8s.io/kube-proxy:v1.32.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:83c025f0faa6799fab6645102a98138e39a9a7db2be3bc792c79d72659b1805d\", size \"30907858\" in 1.554921607s" Feb 13 19:21:53.793072 containerd[1558]: time="2025-02-13T19:21:53.793029455Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.2\" returns image reference \"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\"" Feb 13 19:21:53.793772 containerd[1558]: time="2025-02-13T19:21:53.793753049Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Feb 13 19:21:54.304382 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3026507171.mount: Deactivated successfully. 
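Every successful pull ends with a single summary entry of the form Pulled image "<ref>" ... in <duration>. A throwaway parser for tabulating those durations from a captured journal (the regex and function are ours, written against the escaped msg="..." format above, and assuming one journal entry per line as journalctl would emit):

    import re

    # Matches e.g.: msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.2\" with image id ... in 1.554921607s"
    PULL_RE = re.compile(r'Pulled image \\"(.+?)\\" with image id .*? in ([0-9.]+)(ms|s)"')

    def pull_times(journal_text: str) -> dict:
        """Map image reference -> pull duration in seconds."""
        times = {}
        for ref, value, unit in PULL_RE.findall(journal_text):
            times[ref] = float(value) / (1000.0 if unit == "ms" else 1.0)
        return times

Run against this boot it would yield roughly 1.56s (kube-apiserver), 1.82s (kube-controller-manager), 1.13s (kube-scheduler), and 1.55s (kube-proxy).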
Feb 13 19:21:55.006471 containerd[1558]: time="2025-02-13T19:21:55.006375139Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.016686 containerd[1558]: time="2025-02-13T19:21:55.016504352Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Feb 13 19:21:55.023100 containerd[1558]: time="2025-02-13T19:21:55.023063703Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.027891 containerd[1558]: time="2025-02-13T19:21:55.027867475Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.028745 containerd[1558]: time="2025-02-13T19:21:55.028721416Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.234947083s" Feb 13 19:21:55.028785 containerd[1558]: time="2025-02-13T19:21:55.028744863Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Feb 13 19:21:55.029078 containerd[1558]: time="2025-02-13T19:21:55.029063633Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Feb 13 19:21:55.494338 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4205536341.mount: Deactivated successfully. 
Feb 13 19:21:55.496106 containerd[1558]: time="2025-02-13T19:21:55.496040195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.496508 containerd[1558]: time="2025-02-13T19:21:55.496484601Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Feb 13 19:21:55.497119 containerd[1558]: time="2025-02-13T19:21:55.496550028Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.497787 containerd[1558]: time="2025-02-13T19:21:55.497766427Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:55.498583 containerd[1558]: time="2025-02-13T19:21:55.498240890Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 469.084899ms" Feb 13 19:21:55.498583 containerd[1558]: time="2025-02-13T19:21:55.498257417Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Feb 13 19:21:55.498583 containerd[1558]: time="2025-02-13T19:21:55.498502652Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Feb 13 19:21:56.031109 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3329420029.mount: Deactivated successfully. Feb 13 19:21:59.306732 containerd[1558]: time="2025-02-13T19:21:59.306443182Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:59.307095 containerd[1558]: time="2025-02-13T19:21:59.307069962Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551320" Feb 13 19:21:59.307261 containerd[1558]: time="2025-02-13T19:21:59.307240869Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:59.308907 containerd[1558]: time="2025-02-13T19:21:59.308878333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:21:59.309671 containerd[1558]: time="2025-02-13T19:21:59.309561832Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.811047182s" Feb 13 19:21:59.309671 containerd[1558]: time="2025-02-13T19:21:59.309577998Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Feb 13 19:21:59.542105 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
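On the restart cadence: "restart counter is at 4" was logged at 19:21:49.436 and "restart counter is at 5" at 19:21:59.542, a gap of about 10.1 s. That is consistent with a fixed restart delay of roughly ten seconds in the kubelet unit rather than an escalating backoff (an inference from the timestamps; the unit file itself is not shown in this log). Checking the arithmetic:

    from datetime import datetime

    fmt = "%H:%M:%S.%f"
    # Timestamps of the two "Scheduled restart job" entries above.
    t4 = datetime.strptime("19:21:49.436009", fmt)
    t5 = datetime.strptime("19:21:59.542105", fmt)
    print((t5 - t4).total_seconds())  # 10.106096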
Feb 13 19:21:59.550999 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:22:00.016901 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:00.018489 (kubelet)[2319]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:22:00.043782 kubelet[2319]: E0213 19:22:00.043092 2319 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:22:00.044253 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:22:00.044344 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:22:00.044520 systemd[1]: kubelet.service: Consumed 80ms CPU time, 102.9M memory peak. Feb 13 19:22:01.203342 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:01.203484 systemd[1]: kubelet.service: Consumed 80ms CPU time, 102.9M memory peak. Feb 13 19:22:01.206952 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:22:01.226467 systemd[1]: Reload requested from client PID 2335 ('systemctl') (unit session-9.scope)... Feb 13 19:22:01.226480 systemd[1]: Reloading... Feb 13 19:22:01.302838 zram_generator::config[2386]: No configuration found. Feb 13 19:22:01.347878 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:22:01.365776 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:22:01.429861 systemd[1]: Reloading finished in 203 ms. Feb 13 19:22:01.468444 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Feb 13 19:22:01.468517 systemd[1]: kubelet.service: Failed with result 'signal'. Feb 13 19:22:01.468730 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:01.474075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:22:02.037611 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:02.041254 (kubelet)[2446]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 19:22:02.119131 kubelet[2446]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 19:22:02.119131 kubelet[2446]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Feb 13 19:22:02.119131 kubelet[2446]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
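The crash loops up to this point all fail on the same precondition: the kubelet refuses to run until /var/lib/kubelet/config.yaml exists, which kubeadm writes during init/join. From here on the file is evidently present, since kubelet (pid 2446) gets past config loading and only warns about deprecated flags. A trivial preflight mirroring the failing check (illustrative only; the real check lives inside the kubelet's config loader, not in a script like this):

    import os
    import sys

    KUBELET_CONFIG = "/var/lib/kubelet/config.yaml"

    # Same failure mode as in the log: open(...) -> no such file or directory.
    if not os.path.isfile(KUBELET_CONFIG):
        sys.exit(f"failed to load kubelet config file, path: {KUBELET_CONFIG}")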
Feb 13 19:22:02.119399 kubelet[2446]: I0213 19:22:02.119196 2446 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 19:22:02.438978 kubelet[2446]: I0213 19:22:02.438959 2446 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Feb 13 19:22:02.439839 kubelet[2446]: I0213 19:22:02.439065 2446 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 19:22:02.439839 kubelet[2446]: I0213 19:22:02.439294 2446 server.go:954] "Client rotation is on, will bootstrap in background" Feb 13 19:22:02.571532 kubelet[2446]: E0213 19:22:02.571498 2446 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:02.573679 kubelet[2446]: I0213 19:22:02.573657 2446 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 19:22:02.588247 kubelet[2446]: E0213 19:22:02.588196 2446 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Feb 13 19:22:02.588247 kubelet[2446]: I0213 19:22:02.588221 2446 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Feb 13 19:22:02.592735 kubelet[2446]: I0213 19:22:02.592682 2446 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 19:22:02.596838 kubelet[2446]: I0213 19:22:02.596806 2446 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 19:22:02.596949 kubelet[2446]: I0213 19:22:02.596838 2446 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 13 19:22:02.598256 kubelet[2446]: I0213 19:22:02.598243 2446 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 19:22:02.598256 kubelet[2446]: I0213 19:22:02.598256 2446 container_manager_linux.go:304] "Creating device plugin manager" Feb 13 19:22:02.598348 kubelet[2446]: I0213 19:22:02.598335 2446 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:22:02.601521 kubelet[2446]: I0213 19:22:02.601510 2446 kubelet.go:446] "Attempting to sync node with API server" Feb 13 19:22:02.601521 kubelet[2446]: I0213 19:22:02.601522 2446 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 19:22:02.602136 kubelet[2446]: I0213 19:22:02.601533 2446 kubelet.go:352] "Adding apiserver pod source" Feb 13 19:22:02.602136 kubelet[2446]: I0213 19:22:02.601539 2446 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 19:22:02.605749 kubelet[2446]: W0213 19:22:02.605695 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:02.605749 kubelet[2446]: E0213 19:22:02.605725 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:02.606166 kubelet[2446]: W0213 
19:22:02.605870 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:02.606166 kubelet[2446]: E0213 19:22:02.605891 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:02.606959 kubelet[2446]: I0213 19:22:02.606870 2446 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 19:22:02.609046 kubelet[2446]: I0213 19:22:02.608960 2446 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 19:22:02.610079 kubelet[2446]: W0213 19:22:02.609831 2446 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Feb 13 19:22:02.610176 kubelet[2446]: I0213 19:22:02.610140 2446 watchdog_linux.go:99] "Systemd watchdog is not enabled" Feb 13 19:22:02.610176 kubelet[2446]: I0213 19:22:02.610157 2446 server.go:1287] "Started kubelet" Feb 13 19:22:02.612432 kubelet[2446]: I0213 19:22:02.612249 2446 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 19:22:02.612747 kubelet[2446]: I0213 19:22:02.612720 2446 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 19:22:02.612896 kubelet[2446]: I0213 19:22:02.612885 2446 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 19:22:02.614805 kubelet[2446]: I0213 19:22:02.614797 2446 server.go:490] "Adding debug handlers to kubelet server" Feb 13 19:22:02.616036 kubelet[2446]: E0213 19:22:02.613804 2446 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1823dadcb8051d59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 19:22:02.610146649 +0000 UTC m=+0.566562737,LastTimestamp:2025-02-13 19:22:02.610146649 +0000 UTC m=+0.566562737,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Feb 13 19:22:02.617479 kubelet[2446]: I0213 19:22:02.617402 2446 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 19:22:02.618855 kubelet[2446]: I0213 19:22:02.618841 2446 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 19:22:02.620785 kubelet[2446]: I0213 19:22:02.620777 2446 volume_manager.go:297] "Starting Kubelet Volume Manager" Feb 13 19:22:02.621106 kubelet[2446]: E0213 19:22:02.620932 2446 kubelet_node_status.go:467] "Error getting the current node from lister" err="node 
\"localhost\" not found" Feb 13 19:22:02.622924 kubelet[2446]: E0213 19:22:02.622908 2446 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="200ms" Feb 13 19:22:02.623564 kubelet[2446]: I0213 19:22:02.623556 2446 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 19:22:02.623684 kubelet[2446]: I0213 19:22:02.623628 2446 reconciler.go:26] "Reconciler: start to sync state" Feb 13 19:22:02.624428 kubelet[2446]: W0213 19:22:02.624409 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:02.624827 kubelet[2446]: E0213 19:22:02.624476 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:02.624827 kubelet[2446]: I0213 19:22:02.624639 2446 factory.go:221] Registration of the systemd container factory successfully Feb 13 19:22:02.624827 kubelet[2446]: I0213 19:22:02.624680 2446 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 19:22:02.626672 kubelet[2446]: I0213 19:22:02.626663 2446 factory.go:221] Registration of the containerd container factory successfully Feb 13 19:22:02.631828 kubelet[2446]: I0213 19:22:02.631802 2446 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 19:22:02.632909 kubelet[2446]: I0213 19:22:02.632900 2446 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 13 19:22:02.632952 kubelet[2446]: I0213 19:22:02.632948 2446 status_manager.go:227] "Starting to sync pod status with apiserver" Feb 13 19:22:02.632986 kubelet[2446]: I0213 19:22:02.632982 2446 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
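All of the reflector, lease, and event errors in this stretch share one root cause: nothing is listening on 139.178.70.106:6443 yet, because the kube-apiserver the kubelet is trying to reach is itself one of the static pods this kubelet is about to start. A bare-bones probe that reproduces the same "connection refused" (stdlib only; host and port copied from the log):

    import socket

    def probe(host: str, port: int, timeout: float = 2.0) -> str:
        """Report whether a TCP connect succeeds, mirroring Go's dial tcp errors."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return "open"
        except ConnectionRefusedError:
            return "connection refused"  # what every reflector above is seeing
        except OSError as exc:
            return f"error: {exc}"

    print(probe("139.178.70.106", 6443))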
Feb 13 19:22:02.633024 kubelet[2446]: I0213 19:22:02.633019 2446 kubelet.go:2388] "Starting kubelet main sync loop" Feb 13 19:22:02.633083 kubelet[2446]: E0213 19:22:02.633074 2446 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 19:22:02.641918 kubelet[2446]: W0213 19:22:02.639831 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:02.641918 kubelet[2446]: E0213 19:22:02.641858 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:02.643651 kubelet[2446]: I0213 19:22:02.643564 2446 cpu_manager.go:221] "Starting CPU manager" policy="none" Feb 13 19:22:02.643651 kubelet[2446]: I0213 19:22:02.643572 2446 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Feb 13 19:22:02.643651 kubelet[2446]: I0213 19:22:02.643580 2446 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:22:02.644915 kubelet[2446]: I0213 19:22:02.644903 2446 policy_none.go:49] "None policy: Start" Feb 13 19:22:02.644915 kubelet[2446]: I0213 19:22:02.644915 2446 memory_manager.go:186] "Starting memorymanager" policy="None" Feb 13 19:22:02.644964 kubelet[2446]: I0213 19:22:02.644920 2446 state_mem.go:35] "Initializing new in-memory state store" Feb 13 19:22:02.648340 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Feb 13 19:22:02.658052 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Feb 13 19:22:02.660522 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Feb 13 19:22:02.668439 kubelet[2446]: I0213 19:22:02.668419 2446 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 19:22:02.668715 kubelet[2446]: I0213 19:22:02.668702 2446 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 19:22:02.668747 kubelet[2446]: I0213 19:22:02.668717 2446 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 19:22:02.669867 kubelet[2446]: I0213 19:22:02.669780 2446 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 19:22:02.670467 kubelet[2446]: E0213 19:22:02.670099 2446 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Feb 13 19:22:02.670467 kubelet[2446]: E0213 19:22:02.670121 2446 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Feb 13 19:22:02.741125 systemd[1]: Created slice kubepods-burstable-podc72911152bbceda2f57fd8d59261e015.slice - libcontainer container kubepods-burstable-podc72911152bbceda2f57fd8d59261e015.slice. 
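For reference, the HardEvictionThresholds in the Node Config dump above decode to: memory.available < 100Mi, nodefs.available < 10%, nodefs.inodesFree < 5%, imagefs.available < 15%, and imagefs.inodesFree < 5%. Percentage signals compare availability against capacity; a sketch of that comparison (our own helper, not kubelet code):

    def breaches_threshold(available: float, capacity: float, percentage: float) -> bool:
        """True when a resource's availability drops below the configured fraction of capacity."""
        return available < capacity * percentage

    # nodefs.available < 10% on a hypothetical 20 GiB root filesystem:
    GIB = 1024 ** 3
    print(breaches_threshold(1.5 * GIB, 20 * GIB, 0.10))  # True: 1.5 GiB < 2 GiB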
Feb 13 19:22:02.750421 kubelet[2446]: E0213 19:22:02.750404 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:02.753237 systemd[1]: Created slice kubepods-burstable-pod95ef9ac46cd4dbaadc63cb713310ae59.slice - libcontainer container kubepods-burstable-pod95ef9ac46cd4dbaadc63cb713310ae59.slice. Feb 13 19:22:02.759769 kubelet[2446]: E0213 19:22:02.759752 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:02.762448 systemd[1]: Created slice kubepods-burstable-pod49e6ef414e3fdb457d1696f74c728571.slice - libcontainer container kubepods-burstable-pod49e6ef414e3fdb457d1696f74c728571.slice. Feb 13 19:22:02.763777 kubelet[2446]: E0213 19:22:02.763758 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:02.769678 kubelet[2446]: I0213 19:22:02.769661 2446 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:02.769943 kubelet[2446]: E0213 19:22:02.769914 2446 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 19:22:02.823448 kubelet[2446]: E0213 19:22:02.823387 2446 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="400ms" Feb 13 19:22:02.824998 kubelet[2446]: I0213 19:22:02.824705 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:02.824998 kubelet[2446]: I0213 19:22:02.824726 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:02.824998 kubelet[2446]: I0213 19:22:02.824744 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:02.824998 kubelet[2446]: I0213 19:22:02.824769 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:02.824998 kubelet[2446]: I0213 19:22:02.824806 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:02.825176 kubelet[2446]: I0213 19:22:02.824843 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/95ef9ac46cd4dbaadc63cb713310ae59-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"95ef9ac46cd4dbaadc63cb713310ae59\") " pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:02.825176 kubelet[2446]: I0213 19:22:02.824867 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:02.825176 kubelet[2446]: I0213 19:22:02.824881 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:02.825176 kubelet[2446]: I0213 19:22:02.824919 2446 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:02.970792 kubelet[2446]: I0213 19:22:02.970771 2446 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:02.971052 kubelet[2446]: E0213 19:22:02.971023 2446 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 19:22:03.052452 containerd[1558]: time="2025-02-13T19:22:03.052350220Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c72911152bbceda2f57fd8d59261e015,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:03.061074 containerd[1558]: time="2025-02-13T19:22:03.061045026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:95ef9ac46cd4dbaadc63cb713310ae59,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:03.064761 containerd[1558]: time="2025-02-13T19:22:03.064714194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:49e6ef414e3fdb457d1696f74c728571,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:03.224700 kubelet[2446]: E0213 19:22:03.224671 2446 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="800ms" Feb 13 19:22:03.372853 kubelet[2446]: I0213 19:22:03.372775 2446 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:03.373036 kubelet[2446]: E0213 19:22:03.373017 2446 kubelet_node_status.go:108] "Unable to register node with API 
server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 19:22:03.522324 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount210484804.mount: Deactivated successfully. Feb 13 19:22:03.524157 containerd[1558]: time="2025-02-13T19:22:03.524073575Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:22:03.524838 containerd[1558]: time="2025-02-13T19:22:03.524577766Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:22:03.525282 containerd[1558]: time="2025-02-13T19:22:03.525266746Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:22:03.525525 containerd[1558]: time="2025-02-13T19:22:03.525503198Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Feb 13 19:22:03.525833 containerd[1558]: time="2025-02-13T19:22:03.525804068Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 19:22:03.526033 containerd[1558]: time="2025-02-13T19:22:03.526021218Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:22:03.526212 containerd[1558]: time="2025-02-13T19:22:03.526194951Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 19:22:03.527591 containerd[1558]: time="2025-02-13T19:22:03.527560646Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 19:22:03.528917 containerd[1558]: time="2025-02-13T19:22:03.528902807Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 464.141815ms" Feb 13 19:22:03.530007 containerd[1558]: time="2025-02-13T19:22:03.529947784Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 477.539237ms" Feb 13 19:22:03.549518 containerd[1558]: time="2025-02-13T19:22:03.549406717Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 488.301949ms" Feb 13 19:22:03.677556 containerd[1558]: 
time="2025-02-13T19:22:03.676225620Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:03.677556 containerd[1558]: time="2025-02-13T19:22:03.677510350Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:03.677556 containerd[1558]: time="2025-02-13T19:22:03.677521173Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.677860 containerd[1558]: time="2025-02-13T19:22:03.677568414Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.684421 containerd[1558]: time="2025-02-13T19:22:03.684365834Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:03.685122 containerd[1558]: time="2025-02-13T19:22:03.684926304Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:03.685122 containerd[1558]: time="2025-02-13T19:22:03.684936497Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.685122 containerd[1558]: time="2025-02-13T19:22:03.684974985Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.685658 containerd[1558]: time="2025-02-13T19:22:03.684875452Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:03.685658 containerd[1558]: time="2025-02-13T19:22:03.684910720Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:03.685658 containerd[1558]: time="2025-02-13T19:22:03.684922329Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.685658 containerd[1558]: time="2025-02-13T19:22:03.684967207Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:03.691244 kubelet[2446]: W0213 19:22:03.691119 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:03.691244 kubelet[2446]: E0213 19:22:03.691157 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:03.696912 systemd[1]: Started cri-containerd-3c61eb964da0fad105eedfa0ccef7751acb72f76430019c6543f8a88c08b69c2.scope - libcontainer container 3c61eb964da0fad105eedfa0ccef7751acb72f76430019c6543f8a88c08b69c2. 
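A note on the unit names here: for each container, containerd's runc v2 shim asks systemd for a transient scope named cri-containerd-<id>.scope, where <id> is the 64-hex-character container ID, so the same ID can be correlated across the systemd and containerd entries. A throwaway extractor (ours, assuming the journal text is available as a string):

    import re

    SCOPE_RE = re.compile(r"cri-containerd-([0-9a-f]{64})\.scope")

    def container_ids(journal_text: str) -> set:
        """Container IDs that systemd placed into cri-containerd-<id>.scope units."""
        return set(SCOPE_RE.findall(journal_text))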
Feb 13 19:22:03.697803 kubelet[2446]: W0213 19:22:03.697588 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:03.697803 kubelet[2446]: E0213 19:22:03.697610 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:03.700233 systemd[1]: Started cri-containerd-79afddde1ac42f7aedcfa23cba3e9aa2689a645d0541e7cda2b3f7b172c4c962.scope - libcontainer container 79afddde1ac42f7aedcfa23cba3e9aa2689a645d0541e7cda2b3f7b172c4c962. Feb 13 19:22:03.711902 systemd[1]: Started cri-containerd-83d6ff6b40683277bf9154f3822dd4d10537b501af361f6204c415a18b96b42a.scope - libcontainer container 83d6ff6b40683277bf9154f3822dd4d10537b501af361f6204c415a18b96b42a. Feb 13 19:22:03.745144 containerd[1558]: time="2025-02-13T19:22:03.745097115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:95ef9ac46cd4dbaadc63cb713310ae59,Namespace:kube-system,Attempt:0,} returns sandbox id \"83d6ff6b40683277bf9154f3822dd4d10537b501af361f6204c415a18b96b42a\"" Feb 13 19:22:03.746926 containerd[1558]: time="2025-02-13T19:22:03.746881012Z" level=info msg="CreateContainer within sandbox \"83d6ff6b40683277bf9154f3822dd4d10537b501af361f6204c415a18b96b42a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 19:22:03.748943 containerd[1558]: time="2025-02-13T19:22:03.748927340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c72911152bbceda2f57fd8d59261e015,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c61eb964da0fad105eedfa0ccef7751acb72f76430019c6543f8a88c08b69c2\"" Feb 13 19:22:03.750977 containerd[1558]: time="2025-02-13T19:22:03.750369328Z" level=info msg="CreateContainer within sandbox \"3c61eb964da0fad105eedfa0ccef7751acb72f76430019c6543f8a88c08b69c2\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 19:22:03.751087 containerd[1558]: time="2025-02-13T19:22:03.751071978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:49e6ef414e3fdb457d1696f74c728571,Namespace:kube-system,Attempt:0,} returns sandbox id \"79afddde1ac42f7aedcfa23cba3e9aa2689a645d0541e7cda2b3f7b172c4c962\"" Feb 13 19:22:03.753199 containerd[1558]: time="2025-02-13T19:22:03.753184704Z" level=info msg="CreateContainer within sandbox \"79afddde1ac42f7aedcfa23cba3e9aa2689a645d0541e7cda2b3f7b172c4c962\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 19:22:03.761175 containerd[1558]: time="2025-02-13T19:22:03.761136999Z" level=info msg="CreateContainer within sandbox \"83d6ff6b40683277bf9154f3822dd4d10537b501af361f6204c415a18b96b42a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"78741682c18133c84e54f72b438ef73dbd161c3d5c71a3e58a0ac0053fb3db27\"" Feb 13 19:22:03.761921 containerd[1558]: time="2025-02-13T19:22:03.761884413Z" level=info msg="StartContainer for \"78741682c18133c84e54f72b438ef73dbd161c3d5c71a3e58a0ac0053fb3db27\"" Feb 13 19:22:03.762327 containerd[1558]: time="2025-02-13T19:22:03.762258560Z" level=info 
msg="CreateContainer within sandbox \"79afddde1ac42f7aedcfa23cba3e9aa2689a645d0541e7cda2b3f7b172c4c962\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"ccb90a5ce994cbb599237b8d5b4deae8bc39c34605de3c883ec17cae540f4402\"" Feb 13 19:22:03.763781 containerd[1558]: time="2025-02-13T19:22:03.763762159Z" level=info msg="CreateContainer within sandbox \"3c61eb964da0fad105eedfa0ccef7751acb72f76430019c6543f8a88c08b69c2\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b9c051506551e3c42e282076cf25e4ad9bdeb4dd7d095608ff54e2c51cb8c21a\"" Feb 13 19:22:03.763908 containerd[1558]: time="2025-02-13T19:22:03.763890587Z" level=info msg="StartContainer for \"ccb90a5ce994cbb599237b8d5b4deae8bc39c34605de3c883ec17cae540f4402\"" Feb 13 19:22:03.765533 containerd[1558]: time="2025-02-13T19:22:03.765475057Z" level=info msg="StartContainer for \"b9c051506551e3c42e282076cf25e4ad9bdeb4dd7d095608ff54e2c51cb8c21a\"" Feb 13 19:22:03.787935 systemd[1]: Started cri-containerd-78741682c18133c84e54f72b438ef73dbd161c3d5c71a3e58a0ac0053fb3db27.scope - libcontainer container 78741682c18133c84e54f72b438ef73dbd161c3d5c71a3e58a0ac0053fb3db27. Feb 13 19:22:03.794902 systemd[1]: Started cri-containerd-b9c051506551e3c42e282076cf25e4ad9bdeb4dd7d095608ff54e2c51cb8c21a.scope - libcontainer container b9c051506551e3c42e282076cf25e4ad9bdeb4dd7d095608ff54e2c51cb8c21a. Feb 13 19:22:03.796045 systemd[1]: Started cri-containerd-ccb90a5ce994cbb599237b8d5b4deae8bc39c34605de3c883ec17cae540f4402.scope - libcontainer container ccb90a5ce994cbb599237b8d5b4deae8bc39c34605de3c883ec17cae540f4402. Feb 13 19:22:03.835722 containerd[1558]: time="2025-02-13T19:22:03.835631250Z" level=info msg="StartContainer for \"78741682c18133c84e54f72b438ef73dbd161c3d5c71a3e58a0ac0053fb3db27\" returns successfully" Feb 13 19:22:03.835722 containerd[1558]: time="2025-02-13T19:22:03.835691051Z" level=info msg="StartContainer for \"b9c051506551e3c42e282076cf25e4ad9bdeb4dd7d095608ff54e2c51cb8c21a\" returns successfully" Feb 13 19:22:03.841599 containerd[1558]: time="2025-02-13T19:22:03.841572213Z" level=info msg="StartContainer for \"ccb90a5ce994cbb599237b8d5b4deae8bc39c34605de3c883ec17cae540f4402\" returns successfully" Feb 13 19:22:03.864122 kubelet[2446]: W0213 19:22:03.864087 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:03.864211 kubelet[2446]: E0213 19:22:03.864127 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:04.025470 kubelet[2446]: E0213 19:22:04.025429 2446 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="1.6s" Feb 13 19:22:04.079231 kubelet[2446]: W0213 19:22:04.079178 2446 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 19:22:04.079231 kubelet[2446]: E0213 19:22:04.079214 2446 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.106:6443: connect: connection refused" logger="UnhandledError" Feb 13 19:22:04.175082 kubelet[2446]: I0213 19:22:04.175046 2446 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:04.175347 kubelet[2446]: E0213 19:22:04.175332 2446 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 19:22:04.648143 kubelet[2446]: E0213 19:22:04.648125 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:04.649021 kubelet[2446]: E0213 19:22:04.649013 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:04.651314 kubelet[2446]: E0213 19:22:04.651306 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:05.517661 kubelet[2446]: E0213 19:22:05.517636 2446 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Feb 13 19:22:05.627470 kubelet[2446]: E0213 19:22:05.627404 2446 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Feb 13 19:22:05.652428 kubelet[2446]: E0213 19:22:05.652334 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:05.652671 kubelet[2446]: E0213 19:22:05.652610 2446 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 19:22:05.776549 kubelet[2446]: I0213 19:22:05.776454 2446 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:05.786545 kubelet[2446]: I0213 19:22:05.786463 2446 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Feb 13 19:22:05.786545 kubelet[2446]: E0213 19:22:05.786484 2446 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"localhost\": node \"localhost\" not found" Feb 13 19:22:05.788589 kubelet[2446]: E0213 19:22:05.788573 2446 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:22:05.889669 kubelet[2446]: E0213 19:22:05.889643 2446 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:22:05.990056 kubelet[2446]: E0213 19:22:05.989996 2446 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:22:06.121978 kubelet[2446]: I0213 19:22:06.121900 2446 
kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:06.131660 kubelet[2446]: I0213 19:22:06.131633 2446 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:06.135338 kubelet[2446]: I0213 19:22:06.134788 2446 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:06.605653 kubelet[2446]: I0213 19:22:06.605494 2446 apiserver.go:52] "Watching apiserver" Feb 13 19:22:06.625136 kubelet[2446]: I0213 19:22:06.625084 2446 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 19:22:06.652551 kubelet[2446]: I0213 19:22:06.652532 2446 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:06.655719 kubelet[2446]: E0213 19:22:06.655688 2446 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:07.247953 systemd[1]: Reload requested from client PID 2724 ('systemctl') (unit session-9.scope)... Feb 13 19:22:07.247969 systemd[1]: Reloading... Feb 13 19:22:07.318854 zram_generator::config[2781]: No configuration found. Feb 13 19:22:07.369835 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:22:07.389267 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:22:07.462479 systemd[1]: Reloading finished in 214 ms. Feb 13 19:22:07.484197 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:22:07.501629 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 19:22:07.501777 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:07.501810 systemd[1]: kubelet.service: Consumed 538ms CPU time, 124.4M memory peak. Feb 13 19:22:07.506128 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:22:07.675095 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:22:07.678794 (kubelet)[2835]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 19:22:07.729864 kubelet[2835]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 19:22:07.730086 kubelet[2835]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Feb 13 19:22:07.730128 kubelet[2835]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
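Stepping back to the first kubelet run for a moment: the node-lease controller's "will retry" interval doubled on every failure (200ms, 400ms, 800ms, then 1.6s above), i.e. plain exponential backoff with base 200ms and factor 2. A toy reproduction of that schedule (the log shows only these four steps; any cap beyond them is unknown from this output):

    def backoff_ms(base: float = 200.0, factor: float = 2.0, steps: int = 4):
        """Yield retry intervals in milliseconds: 200, 400, 800, 1600."""
        interval = base
        for _ in range(steps):
            yield interval
            interval *= factor

    print([f"{ms:g}ms" for ms in backoff_ms()])  # ['200ms', '400ms', '800ms', '1600ms']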
Feb 13 19:22:07.730228 kubelet[2835]: I0213 19:22:07.730204 2835 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 19:22:07.734300 kubelet[2835]: I0213 19:22:07.734282 2835 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Feb 13 19:22:07.734300 kubelet[2835]: I0213 19:22:07.734299 2835 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 19:22:07.734448 kubelet[2835]: I0213 19:22:07.734436 2835 server.go:954] "Client rotation is on, will bootstrap in background" Feb 13 19:22:07.735184 kubelet[2835]: I0213 19:22:07.735170 2835 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 19:22:07.736736 kubelet[2835]: I0213 19:22:07.736659 2835 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 19:22:07.764867 kubelet[2835]: E0213 19:22:07.764779 2835 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Feb 13 19:22:07.764867 kubelet[2835]: I0213 19:22:07.764809 2835 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Feb 13 19:22:07.768009 kubelet[2835]: I0213 19:22:07.767944 2835 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Feb 13 19:22:07.768587 kubelet[2835]: I0213 19:22:07.768086 2835 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 19:22:07.768587 kubelet[2835]: I0213 19:22:07.768118 2835 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 13 19:22:07.768587 kubelet[2835]: I0213 19:22:07.768305 2835 topology_manager.go:138] "Creating 
topology manager with none policy" Feb 13 19:22:07.768587 kubelet[2835]: I0213 19:22:07.768312 2835 container_manager_linux.go:304] "Creating device plugin manager" Feb 13 19:22:07.768736 kubelet[2835]: I0213 19:22:07.768340 2835 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:22:07.777192 kubelet[2835]: I0213 19:22:07.776577 2835 kubelet.go:446] "Attempting to sync node with API server" Feb 13 19:22:07.777192 kubelet[2835]: I0213 19:22:07.776720 2835 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 19:22:07.777192 kubelet[2835]: I0213 19:22:07.776736 2835 kubelet.go:352] "Adding apiserver pod source" Feb 13 19:22:07.777192 kubelet[2835]: I0213 19:22:07.776743 2835 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 19:22:07.780636 kubelet[2835]: I0213 19:22:07.780625 2835 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 19:22:07.781004 kubelet[2835]: I0213 19:22:07.780994 2835 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 19:22:07.781361 kubelet[2835]: I0213 19:22:07.781351 2835 watchdog_linux.go:99] "Systemd watchdog is not enabled" Feb 13 19:22:07.781442 kubelet[2835]: I0213 19:22:07.781435 2835 server.go:1287] "Started kubelet" Feb 13 19:22:07.785436 kubelet[2835]: I0213 19:22:07.785424 2835 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 19:22:07.789369 kubelet[2835]: I0213 19:22:07.789298 2835 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 19:22:07.789508 kubelet[2835]: I0213 19:22:07.789494 2835 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 19:22:07.789536 kubelet[2835]: I0213 19:22:07.789527 2835 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 19:22:07.790738 kubelet[2835]: I0213 19:22:07.790318 2835 server.go:490] "Adding debug handlers to kubelet server" Feb 13 19:22:07.797831 kubelet[2835]: I0213 19:22:07.797780 2835 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 19:22:07.799297 kubelet[2835]: I0213 19:22:07.799284 2835 volume_manager.go:297] "Starting Kubelet Volume Manager" Feb 13 19:22:07.805217 kubelet[2835]: I0213 19:22:07.804861 2835 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 19:22:07.805217 kubelet[2835]: I0213 19:22:07.804936 2835 reconciler.go:26] "Reconciler: start to sync state" Feb 13 19:22:07.805329 kubelet[2835]: I0213 19:22:07.805317 2835 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 19:22:07.805957 kubelet[2835]: I0213 19:22:07.805949 2835 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 19:22:07.807372 kubelet[2835]: I0213 19:22:07.807357 2835 factory.go:221] Registration of the systemd container factory successfully Feb 13 19:22:07.807432 kubelet[2835]: I0213 19:22:07.807418 2835 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 19:22:07.810110 kubelet[2835]: I0213 19:22:07.809972 2835 status_manager.go:227] "Starting to sync pod status with apiserver" Feb 13 19:22:07.810110 kubelet[2835]: I0213 19:22:07.809991 2835 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Feb 13 19:22:07.810110 kubelet[2835]: I0213 19:22:07.809995 2835 kubelet.go:2388] "Starting kubelet main sync loop" Feb 13 19:22:07.810110 kubelet[2835]: E0213 19:22:07.810020 2835 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 19:22:07.810300 kubelet[2835]: I0213 19:22:07.810271 2835 factory.go:221] Registration of the containerd container factory successfully Feb 13 19:22:07.850739 kubelet[2835]: I0213 19:22:07.850722 2835 cpu_manager.go:221] "Starting CPU manager" policy="none" Feb 13 19:22:07.850739 kubelet[2835]: I0213 19:22:07.850733 2835 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Feb 13 19:22:07.850865 kubelet[2835]: I0213 19:22:07.850769 2835 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:22:07.850884 kubelet[2835]: I0213 19:22:07.850875 2835 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 19:22:07.850902 kubelet[2835]: I0213 19:22:07.850882 2835 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 13 19:22:07.850902 kubelet[2835]: I0213 19:22:07.850894 2835 policy_none.go:49] "None policy: Start" Feb 13 19:22:07.850902 kubelet[2835]: I0213 19:22:07.850899 2835 memory_manager.go:186] "Starting memorymanager" policy="None" Feb 13 19:22:07.850951 kubelet[2835]: I0213 19:22:07.850905 2835 state_mem.go:35] "Initializing new in-memory state store" Feb 13 19:22:07.850993 kubelet[2835]: I0213 19:22:07.850982 2835 state_mem.go:75] "Updated machine memory state" Feb 13 19:22:07.853482 kubelet[2835]: I0213 19:22:07.853468 2835 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 19:22:07.853578 kubelet[2835]: I0213 19:22:07.853567 2835 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 19:22:07.853602 kubelet[2835]: I0213 19:22:07.853577 2835 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 19:22:07.854106 kubelet[2835]: I0213 19:22:07.854092 2835 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 19:22:07.854781 kubelet[2835]: E0213 19:22:07.854767 2835 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Feb 13 19:22:07.911859 kubelet[2835]: I0213 19:22:07.910969 2835 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:07.918181 kubelet[2835]: I0213 19:22:07.918163 2835 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:07.918393 kubelet[2835]: I0213 19:22:07.918376 2835 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:07.918572 kubelet[2835]: E0213 19:22:07.918537 2835 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:07.921568 kubelet[2835]: E0213 19:22:07.921527 2835 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:07.922001 kubelet[2835]: E0213 19:22:07.921985 2835 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:07.957899 kubelet[2835]: I0213 19:22:07.957845 2835 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 19:22:07.975219 kubelet[2835]: I0213 19:22:07.975196 2835 kubelet_node_status.go:125] "Node was previously registered" node="localhost" Feb 13 19:22:07.975312 kubelet[2835]: I0213 19:22:07.975252 2835 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Feb 13 19:22:08.005668 kubelet[2835]: I0213 19:22:08.005646 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:08.005758 kubelet[2835]: I0213 19:22:08.005750 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:08.005785 kubelet[2835]: I0213 19:22:08.005768 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:08.005810 kubelet[2835]: I0213 19:22:08.005781 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:08.005810 kubelet[2835]: I0213 19:22:08.005806 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/49e6ef414e3fdb457d1696f74c728571-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: 
\"49e6ef414e3fdb457d1696f74c728571\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:08.005893 kubelet[2835]: I0213 19:22:08.005831 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:08.005893 kubelet[2835]: I0213 19:22:08.005855 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:08.005893 kubelet[2835]: I0213 19:22:08.005868 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:22:08.005893 kubelet[2835]: I0213 19:22:08.005879 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/95ef9ac46cd4dbaadc63cb713310ae59-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"95ef9ac46cd4dbaadc63cb713310ae59\") " pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:08.777505 kubelet[2835]: I0213 19:22:08.777468 2835 apiserver.go:52] "Watching apiserver" Feb 13 19:22:08.805863 kubelet[2835]: I0213 19:22:08.805850 2835 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 19:22:08.838837 kubelet[2835]: I0213 19:22:08.838806 2835 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:08.839517 kubelet[2835]: I0213 19:22:08.839339 2835 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:08.842350 kubelet[2835]: E0213 19:22:08.842336 2835 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost" Feb 13 19:22:08.842735 kubelet[2835]: E0213 19:22:08.842725 2835 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Feb 13 19:22:08.860003 kubelet[2835]: I0213 19:22:08.859945 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=2.859932079 podStartE2EDuration="2.859932079s" podCreationTimestamp="2025-02-13 19:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:08.855574327 +0000 UTC m=+1.164774965" watchObservedRunningTime="2025-02-13 19:22:08.859932079 +0000 UTC m=+1.169132724" Feb 13 19:22:08.864079 kubelet[2835]: I0213 19:22:08.864051 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=2.864040523 podStartE2EDuration="2.864040523s" podCreationTimestamp="2025-02-13 
19:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:08.860151499 +0000 UTC m=+1.169352138" watchObservedRunningTime="2025-02-13 19:22:08.864040523 +0000 UTC m=+1.173241159" Feb 13 19:22:08.869060 kubelet[2835]: I0213 19:22:08.869030 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=2.869020317 podStartE2EDuration="2.869020317s" podCreationTimestamp="2025-02-13 19:22:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:08.864507923 +0000 UTC m=+1.173708567" watchObservedRunningTime="2025-02-13 19:22:08.869020317 +0000 UTC m=+1.178220976" Feb 13 19:22:12.295268 sudo[1876]: pam_unix(sudo:session): session closed for user root Feb 13 19:22:12.295930 sshd[1875]: Connection closed by 139.178.89.65 port 53382 Feb 13 19:22:12.296632 sshd-session[1872]: pam_unix(sshd:session): session closed for user core Feb 13 19:22:12.298544 systemd[1]: sshd@6-139.178.70.106:22-139.178.89.65:53382.service: Deactivated successfully. Feb 13 19:22:12.300282 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 19:22:12.300454 systemd[1]: session-9.scope: Consumed 2.919s CPU time, 146.7M memory peak. Feb 13 19:22:12.301599 systemd-logind[1542]: Session 9 logged out. Waiting for processes to exit. Feb 13 19:22:12.302631 systemd-logind[1542]: Removed session 9. Feb 13 19:22:14.101979 kubelet[2835]: I0213 19:22:14.101950 2835 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 19:22:14.102356 containerd[1558]: time="2025-02-13T19:22:14.102280107Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 19:22:14.102592 kubelet[2835]: I0213 19:22:14.102486 2835 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 19:22:14.781444 systemd[1]: Created slice kubepods-besteffort-podc75b487d_2cb2_44ae_a5f3_0c80e801b274.slice - libcontainer container kubepods-besteffort-podc75b487d_2cb2_44ae_a5f3_0c80e801b274.slice. 
Feb 13 19:22:14.852054 kubelet[2835]: I0213 19:22:14.852033 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c75b487d-2cb2-44ae-a5f3-0c80e801b274-kube-proxy\") pod \"kube-proxy-k465d\" (UID: \"c75b487d-2cb2-44ae-a5f3-0c80e801b274\") " pod="kube-system/kube-proxy-k465d" Feb 13 19:22:14.852146 kubelet[2835]: I0213 19:22:14.852085 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c75b487d-2cb2-44ae-a5f3-0c80e801b274-lib-modules\") pod \"kube-proxy-k465d\" (UID: \"c75b487d-2cb2-44ae-a5f3-0c80e801b274\") " pod="kube-system/kube-proxy-k465d" Feb 13 19:22:14.852146 kubelet[2835]: I0213 19:22:14.852102 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbncv\" (UniqueName: \"kubernetes.io/projected/c75b487d-2cb2-44ae-a5f3-0c80e801b274-kube-api-access-mbncv\") pod \"kube-proxy-k465d\" (UID: \"c75b487d-2cb2-44ae-a5f3-0c80e801b274\") " pod="kube-system/kube-proxy-k465d" Feb 13 19:22:14.852146 kubelet[2835]: I0213 19:22:14.852113 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c75b487d-2cb2-44ae-a5f3-0c80e801b274-xtables-lock\") pod \"kube-proxy-k465d\" (UID: \"c75b487d-2cb2-44ae-a5f3-0c80e801b274\") " pod="kube-system/kube-proxy-k465d" Feb 13 19:22:15.089684 systemd[1]: Created slice kubepods-besteffort-poda429d829_6b70_4601_b558_097736ef10bb.slice - libcontainer container kubepods-besteffort-poda429d829_6b70_4601_b558_097736ef10bb.slice. Feb 13 19:22:15.092240 containerd[1558]: time="2025-02-13T19:22:15.091545525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k465d,Uid:c75b487d-2cb2-44ae-a5f3-0c80e801b274,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:15.128192 containerd[1558]: time="2025-02-13T19:22:15.128126011Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:15.128192 containerd[1558]: time="2025-02-13T19:22:15.128160426Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:15.128632 containerd[1558]: time="2025-02-13T19:22:15.128174402Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:15.128632 containerd[1558]: time="2025-02-13T19:22:15.128320962Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:15.141589 systemd[1]: run-containerd-runc-k8s.io-43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee-runc.jK8l9O.mount: Deactivated successfully. Feb 13 19:22:15.148936 systemd[1]: Started cri-containerd-43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee.scope - libcontainer container 43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee. 
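[Annotation: the RunPodSandbox entries above are CRI calls from the kubelet to containerd over its unix socket. A heavily trimmed sketch of the same call using the published cri-api gRPC client, with metadata copied from the kube-proxy-k465d log line; a production request sets many more PodSandboxConfig fields (log directory, DNS, linux options), so this is an illustration of the call shape, not a working replacement for the kubelet:]

```go
// Sketch: dial containerd's CRI endpoint and issue a minimal
// RunPodSandbox request mirroring the kube-proxy sandbox above.
package main

import (
	"context"
	"fmt"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		panic(err)
	}
	defer conn.Close()
	client := runtimeapi.NewRuntimeServiceClient(conn)
	resp, err := client.RunPodSandbox(context.TODO(), &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{
			Metadata: &runtimeapi.PodSandboxMetadata{
				Name:      "kube-proxy-k465d",
				Uid:       "c75b487d-2cb2-44ae-a5f3-0c80e801b274",
				Namespace: "kube-system",
				Attempt:   0,
			},
		},
	})
	if err != nil {
		panic(err)
	}
	fmt.Println("sandbox id:", resp.PodSandboxId)
}
```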
Feb 13 19:22:15.154766 kubelet[2835]: I0213 19:22:15.154740 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a429d829-6b70-4601-b558-097736ef10bb-var-lib-calico\") pod \"tigera-operator-7d68577dc5-vchcj\" (UID: \"a429d829-6b70-4601-b558-097736ef10bb\") " pod="tigera-operator/tigera-operator-7d68577dc5-vchcj" Feb 13 19:22:15.155266 kubelet[2835]: I0213 19:22:15.155071 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppndm\" (UniqueName: \"kubernetes.io/projected/a429d829-6b70-4601-b558-097736ef10bb-kube-api-access-ppndm\") pod \"tigera-operator-7d68577dc5-vchcj\" (UID: \"a429d829-6b70-4601-b558-097736ef10bb\") " pod="tigera-operator/tigera-operator-7d68577dc5-vchcj" Feb 13 19:22:15.163633 containerd[1558]: time="2025-02-13T19:22:15.163608219Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-k465d,Uid:c75b487d-2cb2-44ae-a5f3-0c80e801b274,Namespace:kube-system,Attempt:0,} returns sandbox id \"43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee\"" Feb 13 19:22:15.165599 containerd[1558]: time="2025-02-13T19:22:15.165523856Z" level=info msg="CreateContainer within sandbox \"43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 19:22:15.207432 containerd[1558]: time="2025-02-13T19:22:15.207407518Z" level=info msg="CreateContainer within sandbox \"43dc64f59ce18c1c09b66f87be6d9e0260e834549087c47c84573a3843d315ee\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d5c8ee158102b11e5c8f176ec71278bcb478c99d8d38bf6be176ae76892bd055\"" Feb 13 19:22:15.208012 containerd[1558]: time="2025-02-13T19:22:15.207892950Z" level=info msg="StartContainer for \"d5c8ee158102b11e5c8f176ec71278bcb478c99d8d38bf6be176ae76892bd055\"" Feb 13 19:22:15.228949 systemd[1]: Started cri-containerd-d5c8ee158102b11e5c8f176ec71278bcb478c99d8d38bf6be176ae76892bd055.scope - libcontainer container d5c8ee158102b11e5c8f176ec71278bcb478c99d8d38bf6be176ae76892bd055. Feb 13 19:22:15.248049 containerd[1558]: time="2025-02-13T19:22:15.248024548Z" level=info msg="StartContainer for \"d5c8ee158102b11e5c8f176ec71278bcb478c99d8d38bf6be176ae76892bd055\" returns successfully" Feb 13 19:22:15.392091 containerd[1558]: time="2025-02-13T19:22:15.392010255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-vchcj,Uid:a429d829-6b70-4601-b558-097736ef10bb,Namespace:tigera-operator,Attempt:0,}" Feb 13 19:22:15.429601 containerd[1558]: time="2025-02-13T19:22:15.429458951Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:15.429601 containerd[1558]: time="2025-02-13T19:22:15.429492665Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:15.429601 containerd[1558]: time="2025-02-13T19:22:15.429501889Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:15.429601 containerd[1558]: time="2025-02-13T19:22:15.429552374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:15.442955 systemd[1]: Started cri-containerd-2b6741538d0f3e935c92550f2ba8185735ef874905eb030154169e87ea0a2cf0.scope - libcontainer container 2b6741538d0f3e935c92550f2ba8185735ef874905eb030154169e87ea0a2cf0. Feb 13 19:22:15.470257 containerd[1558]: time="2025-02-13T19:22:15.470232198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-vchcj,Uid:a429d829-6b70-4601-b558-097736ef10bb,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"2b6741538d0f3e935c92550f2ba8185735ef874905eb030154169e87ea0a2cf0\"" Feb 13 19:22:15.471577 containerd[1558]: time="2025-02-13T19:22:15.471556807Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Feb 13 19:22:16.885502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2327266944.mount: Deactivated successfully. Feb 13 19:22:17.352886 containerd[1558]: time="2025-02-13T19:22:17.352861735Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:17.353622 containerd[1558]: time="2025-02-13T19:22:17.353131838Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Feb 13 19:22:17.353622 containerd[1558]: time="2025-02-13T19:22:17.353601356Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:17.354792 containerd[1558]: time="2025-02-13T19:22:17.354763369Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:17.355466 containerd[1558]: time="2025-02-13T19:22:17.355225868Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.883526969s" Feb 13 19:22:17.355466 containerd[1558]: time="2025-02-13T19:22:17.355242455Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Feb 13 19:22:17.356929 containerd[1558]: time="2025-02-13T19:22:17.356807006Z" level=info msg="CreateContainer within sandbox \"2b6741538d0f3e935c92550f2ba8185735ef874905eb030154169e87ea0a2cf0\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 19:22:17.362731 containerd[1558]: time="2025-02-13T19:22:17.362703306Z" level=info msg="CreateContainer within sandbox \"2b6741538d0f3e935c92550f2ba8185735ef874905eb030154169e87ea0a2cf0\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ceb3bbfd915b51cd9383935cb49b390cd8b578c748eb615cd09eaceb02366ef9\"" Feb 13 19:22:17.363065 containerd[1558]: time="2025-02-13T19:22:17.362989020Z" level=info msg="StartContainer for \"ceb3bbfd915b51cd9383935cb49b390cd8b578c748eb615cd09eaceb02366ef9\"" Feb 13 19:22:17.390957 systemd[1]: Started cri-containerd-ceb3bbfd915b51cd9383935cb49b390cd8b578c748eb615cd09eaceb02366ef9.scope - libcontainer container ceb3bbfd915b51cd9383935cb49b390cd8b578c748eb615cd09eaceb02366ef9. 
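[Annotation: the PullImage/"Pulled image" pair above is containerd fetching the tigera-operator image by tag and recording its digest and size. A minimal sketch of the equivalent pull through containerd's Go client; the "k8s.io" namespace is where CRI-managed images live, and error handling is trimmed for brevity:]

```go
// Sketch: pull quay.io/tigera/operator:v1.36.2 via the containerd client
// and print the resolved name and manifest digest, as the log does.
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	image, err := client.Pull(ctx, "quay.io/tigera/operator:v1.36.2", containerd.WithPullUnpack)
	if err != nil {
		panic(err)
	}
	fmt.Println("pulled:", image.Name(), "digest:", image.Target().Digest)
}
```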
Feb 13 19:22:17.407809 containerd[1558]: time="2025-02-13T19:22:17.407779249Z" level=info msg="StartContainer for \"ceb3bbfd915b51cd9383935cb49b390cd8b578c748eb615cd09eaceb02366ef9\" returns successfully" Feb 13 19:22:17.868778 kubelet[2835]: I0213 19:22:17.868729 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-k465d" podStartSLOduration=3.86372902 podStartE2EDuration="3.86372902s" podCreationTimestamp="2025-02-13 19:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:15.855957152 +0000 UTC m=+8.165157788" watchObservedRunningTime="2025-02-13 19:22:17.86372902 +0000 UTC m=+10.172929652" Feb 13 19:22:17.869832 kubelet[2835]: I0213 19:22:17.869170 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7d68577dc5-vchcj" podStartSLOduration=0.984387184 podStartE2EDuration="2.869145177s" podCreationTimestamp="2025-02-13 19:22:15 +0000 UTC" firstStartedPulling="2025-02-13 19:22:15.470935724 +0000 UTC m=+7.780136349" lastFinishedPulling="2025-02-13 19:22:17.355693714 +0000 UTC m=+9.664894342" observedRunningTime="2025-02-13 19:22:17.863580073 +0000 UTC m=+10.172780710" watchObservedRunningTime="2025-02-13 19:22:17.869145177 +0000 UTC m=+10.178345814" Feb 13 19:22:20.162018 systemd[1]: Created slice kubepods-besteffort-pod20db3966_965c_4317_895d_8eef2b64032d.slice - libcontainer container kubepods-besteffort-pod20db3966_965c_4317_895d_8eef2b64032d.slice. Feb 13 19:22:20.189214 kubelet[2835]: I0213 19:22:20.189136 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/20db3966-965c-4317-895d-8eef2b64032d-tigera-ca-bundle\") pod \"calico-typha-5dc7ccdf78-sn25p\" (UID: \"20db3966-965c-4317-895d-8eef2b64032d\") " pod="calico-system/calico-typha-5dc7ccdf78-sn25p" Feb 13 19:22:20.189214 kubelet[2835]: I0213 19:22:20.189163 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/20db3966-965c-4317-895d-8eef2b64032d-typha-certs\") pod \"calico-typha-5dc7ccdf78-sn25p\" (UID: \"20db3966-965c-4317-895d-8eef2b64032d\") " pod="calico-system/calico-typha-5dc7ccdf78-sn25p" Feb 13 19:22:20.189214 kubelet[2835]: I0213 19:22:20.189176 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44kqm\" (UniqueName: \"kubernetes.io/projected/20db3966-965c-4317-895d-8eef2b64032d-kube-api-access-44kqm\") pod \"calico-typha-5dc7ccdf78-sn25p\" (UID: \"20db3966-965c-4317-895d-8eef2b64032d\") " pod="calico-system/calico-typha-5dc7ccdf78-sn25p" Feb 13 19:22:20.252235 systemd[1]: Created slice kubepods-besteffort-pod045bebdc_a7aa_4f5c_8242_2b5c260d2490.slice - libcontainer container kubepods-besteffort-pod045bebdc_a7aa_4f5c_8242_2b5c260d2490.slice. 
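[Annotation: the two pod_startup_latency_tracker entries above differ in an instructive way. For kube-proxy, which pulled nothing, podStartSLOduration equals podStartE2EDuration; for tigera-operator, the SLO duration (0.984387184s) is the end-to-end duration (2.869145177s) minus the image-pull window. A minimal sketch that reproduces the logged number from the monotonic (m=+) offsets:]

```go
// Reproduce tigera-operator's podStartSLOduration from the log values:
// SLO = E2E - (lastFinishedPulling - firstStartedPulling).
package main

import (
	"fmt"
	"time"
)

func main() {
	e2e := 2869145177 * time.Nanosecond                  // podStartE2EDuration=2.869145177s
	firstStartedPulling := 7780136349 * time.Nanosecond  // m=+7.780136349
	lastFinishedPulling := 9664894342 * time.Nanosecond  // m=+9.664894342
	slo := e2e - (lastFinishedPulling - firstStartedPulling)
	fmt.Println(slo) // 984.387184ms, matching podStartSLOduration=0.984387184
}
```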
Feb 13 19:22:20.289990 kubelet[2835]: I0213 19:22:20.289964 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-flexvol-driver-host\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.289990 kubelet[2835]: I0213 19:22:20.289989 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bq4q\" (UniqueName: \"kubernetes.io/projected/045bebdc-a7aa-4f5c-8242-2b5c260d2490-kube-api-access-8bq4q\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290137 kubelet[2835]: I0213 19:22:20.290009 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/045bebdc-a7aa-4f5c-8242-2b5c260d2490-node-certs\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290137 kubelet[2835]: I0213 19:22:20.290022 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-cni-bin-dir\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290137 kubelet[2835]: I0213 19:22:20.290041 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-cni-net-dir\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290137 kubelet[2835]: I0213 19:22:20.290051 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-policysync\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290137 kubelet[2835]: I0213 19:22:20.290067 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/045bebdc-a7aa-4f5c-8242-2b5c260d2490-tigera-ca-bundle\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290310 kubelet[2835]: I0213 19:22:20.290075 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-var-run-calico\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290310 kubelet[2835]: I0213 19:22:20.290085 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-lib-modules\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290310 kubelet[2835]: I0213 19:22:20.290096 2835 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-xtables-lock\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290310 kubelet[2835]: I0213 19:22:20.290109 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-cni-log-dir\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.290310 kubelet[2835]: I0213 19:22:20.290127 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/045bebdc-a7aa-4f5c-8242-2b5c260d2490-var-lib-calico\") pod \"calico-node-ls26c\" (UID: \"045bebdc-a7aa-4f5c-8242-2b5c260d2490\") " pod="calico-system/calico-node-ls26c" Feb 13 19:22:20.355726 kubelet[2835]: E0213 19:22:20.354603 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:20.356756 kubelet[2835]: I0213 19:22:20.356635 2835 status_manager.go:890] "Failed to get status for pod" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" pod="calico-system/csi-node-driver-8xksv" err="pods \"csi-node-driver-8xksv\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'localhost' and this object" Feb 13 19:22:20.390320 kubelet[2835]: I0213 19:22:20.390296 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/d4951e2a-1949-44bc-afb1-457c1decf801-varrun\") pod \"csi-node-driver-8xksv\" (UID: \"d4951e2a-1949-44bc-afb1-457c1decf801\") " pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:20.391026 kubelet[2835]: I0213 19:22:20.390844 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d4951e2a-1949-44bc-afb1-457c1decf801-registration-dir\") pod \"csi-node-driver-8xksv\" (UID: \"d4951e2a-1949-44bc-afb1-457c1decf801\") " pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:20.391026 kubelet[2835]: I0213 19:22:20.390972 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d4951e2a-1949-44bc-afb1-457c1decf801-socket-dir\") pod \"csi-node-driver-8xksv\" (UID: \"d4951e2a-1949-44bc-afb1-457c1decf801\") " pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:20.391095 kubelet[2835]: I0213 19:22:20.391043 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d4951e2a-1949-44bc-afb1-457c1decf801-kubelet-dir\") pod \"csi-node-driver-8xksv\" (UID: \"d4951e2a-1949-44bc-afb1-457c1decf801\") " pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:20.391095 kubelet[2835]: I0213 19:22:20.391057 2835 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8gt9\" (UniqueName: \"kubernetes.io/projected/d4951e2a-1949-44bc-afb1-457c1decf801-kube-api-access-s8gt9\") pod \"csi-node-driver-8xksv\" (UID: \"d4951e2a-1949-44bc-afb1-457c1decf801\") " pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:20.397811 kubelet[2835]: E0213 19:22:20.397288 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.397811 kubelet[2835]: W0213 19:22:20.397303 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.399722 kubelet[2835]: E0213 19:22:20.399703 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.404709 kubelet[2835]: E0213 19:22:20.404690 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.404709 kubelet[2835]: W0213 19:22:20.404703 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.404793 kubelet[2835]: E0213 19:22:20.404718 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.465064 containerd[1558]: time="2025-02-13T19:22:20.465037486Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dc7ccdf78-sn25p,Uid:20db3966-965c-4317-895d-8eef2b64032d,Namespace:calico-system,Attempt:0,}" Feb 13 19:22:20.485395 containerd[1558]: time="2025-02-13T19:22:20.484952828Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:20.485395 containerd[1558]: time="2025-02-13T19:22:20.484997135Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:20.485395 containerd[1558]: time="2025-02-13T19:22:20.485007256Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:20.485395 containerd[1558]: time="2025-02-13T19:22:20.485300208Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:20.492791 kubelet[2835]: E0213 19:22:20.492750 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.492791 kubelet[2835]: W0213 19:22:20.492764 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.492791 kubelet[2835]: E0213 19:22:20.492777 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.492928 kubelet[2835]: E0213 19:22:20.492903 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.492928 kubelet[2835]: W0213 19:22:20.492908 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.492928 kubelet[2835]: E0213 19:22:20.492913 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.493517 kubelet[2835]: E0213 19:22:20.493499 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.493517 kubelet[2835]: W0213 19:22:20.493507 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.493624 kubelet[2835]: E0213 19:22:20.493556 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.493876 kubelet[2835]: E0213 19:22:20.493776 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.493876 kubelet[2835]: W0213 19:22:20.493782 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.494068 kubelet[2835]: E0213 19:22:20.493802 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.494068 kubelet[2835]: E0213 19:22:20.494003 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.494068 kubelet[2835]: W0213 19:22:20.494008 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.494068 kubelet[2835]: E0213 19:22:20.494025 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.494298 kubelet[2835]: E0213 19:22:20.494241 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.494298 kubelet[2835]: W0213 19:22:20.494249 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.494298 kubelet[2835]: E0213 19:22:20.494259 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.495027 kubelet[2835]: E0213 19:22:20.494948 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.495027 kubelet[2835]: W0213 19:22:20.494955 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.495027 kubelet[2835]: E0213 19:22:20.494966 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.495202 kubelet[2835]: E0213 19:22:20.495082 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.495202 kubelet[2835]: W0213 19:22:20.495087 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.495202 kubelet[2835]: E0213 19:22:20.495093 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.495726 kubelet[2835]: E0213 19:22:20.495398 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.495726 kubelet[2835]: W0213 19:22:20.495405 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.495726 kubelet[2835]: E0213 19:22:20.495419 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.495900 kubelet[2835]: E0213 19:22:20.495833 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.495900 kubelet[2835]: W0213 19:22:20.495840 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.495900 kubelet[2835]: E0213 19:22:20.495879 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.496341 kubelet[2835]: E0213 19:22:20.496266 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.496341 kubelet[2835]: W0213 19:22:20.496272 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.496341 kubelet[2835]: E0213 19:22:20.496286 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.496566 kubelet[2835]: E0213 19:22:20.496499 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.496566 kubelet[2835]: W0213 19:22:20.496506 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.497020 kubelet[2835]: E0213 19:22:20.496921 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.497020 kubelet[2835]: E0213 19:22:20.496952 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.497020 kubelet[2835]: W0213 19:22:20.496957 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.497020 kubelet[2835]: E0213 19:22:20.496968 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.497368 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.502323 kubelet[2835]: W0213 19:22:20.497373 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.497382 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.498081 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.502323 kubelet[2835]: W0213 19:22:20.498087 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.498105 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.498212 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.502323 kubelet[2835]: W0213 19:22:20.498217 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.498229 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.502323 kubelet[2835]: E0213 19:22:20.498883 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.506747 kubelet[2835]: W0213 19:22:20.498889 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.498935 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.499009 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.506747 kubelet[2835]: W0213 19:22:20.499014 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.499097 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.506747 kubelet[2835]: W0213 19:22:20.499102 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.499108 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.499201 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.506747 kubelet[2835]: W0213 19:22:20.499206 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.506747 kubelet[2835]: E0213 19:22:20.499210 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.503367 systemd[1]: Started cri-containerd-f46947783f3e6b7e21ab4846e9c6047b3bf8b53fa3eb716aa02cc8cb7365164b.scope - libcontainer container f46947783f3e6b7e21ab4846e9c6047b3bf8b53fa3eb716aa02cc8cb7365164b. Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.499346 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.499458 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511311 kubelet[2835]: W0213 19:22:20.499464 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.499471 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.499558 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511311 kubelet[2835]: W0213 19:22:20.499563 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.499568 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.501805 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511311 kubelet[2835]: W0213 19:22:20.501858 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511311 kubelet[2835]: E0213 19:22:20.501866 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.501983 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511480 kubelet[2835]: W0213 19:22:20.501987 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.501993 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.502089 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511480 kubelet[2835]: W0213 19:22:20.502093 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.502098 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.508766 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.511480 kubelet[2835]: W0213 19:22:20.508775 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.511480 kubelet[2835]: E0213 19:22:20.508789 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:22:20.539073 containerd[1558]: time="2025-02-13T19:22:20.539048215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5dc7ccdf78-sn25p,Uid:20db3966-965c-4317-895d-8eef2b64032d,Namespace:calico-system,Attempt:0,} returns sandbox id \"f46947783f3e6b7e21ab4846e9c6047b3bf8b53fa3eb716aa02cc8cb7365164b\"" Feb 13 19:22:20.540064 containerd[1558]: time="2025-02-13T19:22:20.540044772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 19:22:20.563403 containerd[1558]: time="2025-02-13T19:22:20.563363999Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ls26c,Uid:045bebdc-a7aa-4f5c-8242-2b5c260d2490,Namespace:calico-system,Attempt:0,}" Feb 13 19:22:20.715760 containerd[1558]: time="2025-02-13T19:22:20.715643026Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:20.715760 containerd[1558]: time="2025-02-13T19:22:20.715687274Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:20.715760 containerd[1558]: time="2025-02-13T19:22:20.715697060Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:20.716625 containerd[1558]: time="2025-02-13T19:22:20.716270911Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:20.730961 systemd[1]: Started cri-containerd-3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76.scope - libcontainer container 3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76. Feb 13 19:22:20.748109 containerd[1558]: time="2025-02-13T19:22:20.748079992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-ls26c,Uid:045bebdc-a7aa-4f5c-8242-2b5c260d2490,Namespace:calico-system,Attempt:0,} returns sandbox id \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\"" Feb 13 19:22:20.880467 kubelet[2835]: E0213 19:22:20.880414 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:22:20.880467 kubelet[2835]: W0213 19:22:20.880432 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:22:20.880467 kubelet[2835]: E0213 19:22:20.880449 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Feb 13 19:22:20.982969 kubelet[2835]: E0213 19:22:20.982905 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 19:22:20.983359 kubelet[2835]: W0213 19:22:20.983340 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 19:22:20.983424 kubelet[2835]: E0213 19:22:20.983415 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 19:22:21.811380 kubelet[2835]: E0213 19:22:21.811006 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801"
Feb 13 19:22:22.711577 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2308034991.mount: Deactivated successfully.
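The recurring kubelet triad above comes from the FlexVolume dynamic plugin prober: for each driver directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ it executes the driver binary with the argument init and decodes its stdout as JSON. Because the nodeagent~uds/uds binary is not installed yet, the call produces no output at all, and decoding an empty buffer is exactly what yields "unexpected end of JSON input". A minimal sketch in Go; the driverStatus shape follows the documented FlexVolume call convention and is an illustration, not kubelet source:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// driverStatus mirrors the kind of response a FlexVolume driver is expected
// to print for "init": a status string plus optional capability flags.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The missing uds binary produces no stdout, and decoding an empty
	// buffer reproduces the exact error string seen in the log.
	var st driverStatus
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // unexpected end of JSON input

	// What a healthy driver would print instead:
	ok, _ := json.Marshal(driverStatus{
		Status:       "Success",
		Capabilities: map[string]bool{"attach": false},
	})
	fmt.Println(string(ok)) // {"status":"Success","capabilities":{"attach":false}}
}
```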
Feb 13 19:22:23.349831 containerd[1558]: time="2025-02-13T19:22:23.349793995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:23.351940 containerd[1558]: time="2025-02-13T19:22:23.351911857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363"
Feb 13 19:22:23.365426 containerd[1558]: time="2025-02-13T19:22:23.365367946Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:23.366011 containerd[1558]: time="2025-02-13T19:22:23.365997709Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.825935983s"
Feb 13 19:22:23.366120 containerd[1558]: time="2025-02-13T19:22:23.366062645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Feb 13 19:22:23.366500 containerd[1558]: time="2025-02-13T19:22:23.366431427Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:23.386443 containerd[1558]: time="2025-02-13T19:22:23.386415154Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Feb 13 19:22:23.396808 containerd[1558]: time="2025-02-13T19:22:23.396757703Z" level=info msg="CreateContainer within sandbox \"f46947783f3e6b7e21ab4846e9c6047b3bf8b53fa3eb716aa02cc8cb7365164b\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Feb 13 19:22:23.435096 containerd[1558]: time="2025-02-13T19:22:23.435022811Z" level=info msg="CreateContainer within sandbox \"f46947783f3e6b7e21ab4846e9c6047b3bf8b53fa3eb716aa02cc8cb7365164b\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"5c19b744ea66a5823aa1e2e55a65d88114c102ac0a1230d2536dd39e2e9daef5\""
Feb 13 19:22:23.435496 containerd[1558]: time="2025-02-13T19:22:23.435377844Z" level=info msg="StartContainer for \"5c19b744ea66a5823aa1e2e55a65d88114c102ac0a1230d2536dd39e2e9daef5\""
Feb 13 19:22:23.489972 systemd[1]: Started cri-containerd-5c19b744ea66a5823aa1e2e55a65d88114c102ac0a1230d2536dd39e2e9daef5.scope - libcontainer container 5c19b744ea66a5823aa1e2e55a65d88114c102ac0a1230d2536dd39e2e9daef5.
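As a rough arithmetic check on the pull record above: 31,343,363 bytes were read over the reported 2.825935983 s, about 11.1 MB/s of registry transfer (compressed bytes on the wire, not the unpacked image size):

```go
package main

import "fmt"

func main() {
	// Figures taken from the containerd entries above for typha:v3.29.1.
	const bytesRead = 31343363  // "bytes read" while pulling
	const seconds = 2.825935983 // reported pull duration
	fmt.Printf("%.1f MB/s\n", bytesRead/seconds/1e6) // ≈ 11.1 MB/s
}
```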
Feb 13 19:22:23.524619 containerd[1558]: time="2025-02-13T19:22:23.524534579Z" level=info msg="StartContainer for \"5c19b744ea66a5823aa1e2e55a65d88114c102ac0a1230d2536dd39e2e9daef5\" returns successfully"
Feb 13 19:22:23.811331 kubelet[2835]: E0213 19:22:23.811054 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801"
Feb 13 19:22:23.908986 kubelet[2835]: E0213 19:22:23.908937 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 19:22:23.909478 kubelet[2835]: W0213 19:22:23.909258 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 19:22:23.909478 kubelet[2835]: E0213 19:22:23.909276 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 19:22:23.920570 kubelet[2835]: I0213 19:22:23.920329 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5dc7ccdf78-sn25p" podStartSLOduration=1.07363067 podStartE2EDuration="3.920231043s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:20.5397117 +0000 UTC m=+12.848912324" lastFinishedPulling="2025-02-13 19:22:23.386312067 +0000 UTC m=+15.695512697" observedRunningTime="2025-02-13 19:22:23.919719482 +0000 UTC m=+16.228920121" watchObservedRunningTime="2025-02-13 19:22:23.920231043 +0000 UTC m=+16.229431671"
Feb 13 19:22:23.921434 kubelet[2835]: E0213 19:22:23.921417 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 19:22:23.921434 kubelet[2835]: W0213 19:22:23.921431 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 19:22:23.921632 kubelet[2835]: E0213 19:22:23.921444 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
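The pod_startup_latency_tracker entry above is internally consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling), so pull time is excluded from the SLO measurement. A quick check with the logged timestamps, under that reading of the fields:

```go
package main

import (
	"fmt"
	"time"
)

func mustParse(s string) time.Time {
	// Layout matches Go's default time.Time.String() form used in the log.
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Timestamps copied from the pod_startup_latency_tracker entry above.
	created := mustParse("2025-02-13 19:22:20 +0000 UTC")
	pullStart := mustParse("2025-02-13 19:22:20.5397117 +0000 UTC")
	pullEnd := mustParse("2025-02-13 19:22:23.386312067 +0000 UTC")
	running := mustParse("2025-02-13 19:22:23.920231043 +0000 UTC")

	e2e := running.Sub(created)         // podStartE2EDuration
	slo := e2e - pullEnd.Sub(pullStart) // E2E minus the image-pull window
	fmt.Println(e2e) // 3.920231043s
	fmt.Println(slo) // 1.073630676s ≈ logged podStartSLOduration=1.07363067
}
```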
Feb 13 19:22:24.918668 kubelet[2835]: E0213 19:22:24.918641 2835 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 19:22:24.918668 kubelet[2835]: W0213 19:22:24.918658 2835 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 19:22:24.918668 kubelet[2835]: E0213 19:22:24.918672 2835 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 19:22:25.246783 containerd[1558]: time="2025-02-13T19:22:25.246214835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:25.246783 containerd[1558]: time="2025-02-13T19:22:25.246598775Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121"
Feb 13 19:22:25.246783 containerd[1558]: time="2025-02-13T19:22:25.246752559Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:25.248119 containerd[1558]: time="2025-02-13T19:22:25.248102954Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 19:22:25.254148 containerd[1558]: time="2025-02-13T19:22:25.254124047Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.867579189s"
Feb 13 19:22:25.254148 containerd[1558]: time="2025-02-13T19:22:25.254147488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\""
Feb 13 19:22:25.256962 containerd[1558]: time="2025-02-13T19:22:25.256936075Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Feb 13 19:22:25.330185 containerd[1558]: time="2025-02-13T19:22:25.330112497Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40\""
Feb 13 19:22:25.330750 containerd[1558]: time="2025-02-13T19:22:25.330579552Z" level=info msg="StartContainer for \"684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40\""
Feb 13 19:22:25.358012 systemd[1]: Started cri-containerd-684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40.scope - libcontainer container 684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40.
Feb 13 19:22:25.381078 containerd[1558]: time="2025-02-13T19:22:25.381004944Z" level=info msg="StartContainer for \"684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40\" returns successfully"
Feb 13 19:22:25.391491 systemd[1]: cri-containerd-684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40.scope: Deactivated successfully.
Feb 13 19:22:25.407224 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40-rootfs.mount: Deactivated successfully.
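The short-lived flexvol-driver container above (started at 19:22:25.38, scope deactivated at 19:22:25.39) is Calico's pod2daemon-flexvol init container; its role is to copy the uds FlexVolume driver into the kubelet plugin directory probed in the earlier errors, which is consistent with the nodeagent~uds failures not recurring after this point in the log. A hypothetical post-condition check, our illustration rather than Calico code, using the path from the kubelet errors:

```go
package main

import (
	"fmt"
	"os"
)

func main() {
	// Path the kubelet was probing in the errors above; after the
	// flexvol-driver init container runs, this binary should exist.
	const uds = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
	if fi, err := os.Stat(uds); err != nil {
		fmt.Println("driver still missing:", err)
	} else {
		fmt.Printf("driver installed, mode %v, %d bytes\n", fi.Mode(), fi.Size())
	}
}
```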
Feb 13 19:22:25.811233 kubelet[2835]: E0213 19:22:25.810981 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:25.990136 containerd[1558]: time="2025-02-13T19:22:25.985309078Z" level=info msg="shim disconnected" id=684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40 namespace=k8s.io Feb 13 19:22:25.990136 containerd[1558]: time="2025-02-13T19:22:25.990134029Z" level=warning msg="cleaning up after shim disconnected" id=684670c1be685220e7a36c86112af510adfed1a4942e62607b8342678eff6c40 namespace=k8s.io Feb 13 19:22:25.990136 containerd[1558]: time="2025-02-13T19:22:25.990141066Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:22:26.906739 containerd[1558]: time="2025-02-13T19:22:26.906707922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 19:22:27.812324 kubelet[2835]: E0213 19:22:27.812004 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:29.820922 kubelet[2835]: E0213 19:22:29.820338 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:31.824666 kubelet[2835]: E0213 19:22:31.824517 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:32.038854 containerd[1558]: time="2025-02-13T19:22:32.038334394Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:32.039533 containerd[1558]: time="2025-02-13T19:22:32.039501603Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 19:22:32.043962 containerd[1558]: time="2025-02-13T19:22:32.043904829Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:32.045078 containerd[1558]: time="2025-02-13T19:22:32.044680088Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 5.137928423s" Feb 13 19:22:32.045078 containerd[1558]: time="2025-02-13T19:22:32.044710613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference 
\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 19:22:32.045753 containerd[1558]: time="2025-02-13T19:22:32.045354913Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:32.047516 containerd[1558]: time="2025-02-13T19:22:32.047480611Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 19:22:32.070538 containerd[1558]: time="2025-02-13T19:22:32.070447550Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d\"" Feb 13 19:22:32.071496 containerd[1558]: time="2025-02-13T19:22:32.071404729Z" level=info msg="StartContainer for \"31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d\"" Feb 13 19:22:32.124375 systemd[1]: run-containerd-runc-k8s.io-31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d-runc.K3OJCL.mount: Deactivated successfully. Feb 13 19:22:32.131970 systemd[1]: Started cri-containerd-31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d.scope - libcontainer container 31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d. Feb 13 19:22:32.197758 containerd[1558]: time="2025-02-13T19:22:32.197706865Z" level=info msg="StartContainer for \"31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d\" returns successfully" Feb 13 19:22:33.810842 kubelet[2835]: E0213 19:22:33.810377 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:34.346132 systemd[1]: cri-containerd-31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d.scope: Deactivated successfully. Feb 13 19:22:34.346491 systemd[1]: cri-containerd-31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d.scope: Consumed 296ms CPU time, 144.4M memory peak, 12K read from disk, 151M written to disk. Feb 13 19:22:34.398863 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d-rootfs.mount: Deactivated successfully. Feb 13 19:22:34.471117 kubelet[2835]: I0213 19:22:34.471098 2835 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Feb 13 19:22:34.665436 systemd[1]: Created slice kubepods-besteffort-pod831dd8f9_aeed_441d_a3bb_5e45b59bb4ba.slice - libcontainer container kubepods-besteffort-pod831dd8f9_aeed_441d_a3bb_5e45b59bb4ba.slice. Feb 13 19:22:34.685601 systemd[1]: Created slice kubepods-besteffort-pod40b4dc54_80eb_4f55_8dea_43e4da051ec7.slice - libcontainer container kubepods-besteffort-pod40b4dc54_80eb_4f55_8dea_43e4da051ec7.slice. 
Feb 13 19:22:34.690171 kubelet[2835]: W0213 19:22:34.690075 2835 reflector.go:569] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Feb 13 19:22:34.694658 kubelet[2835]: E0213 19:22:34.694601 2835 reflector.go:166] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Feb 13 19:22:34.694808 kubelet[2835]: W0213 19:22:34.694682 2835 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'localhost' and this object Feb 13 19:22:34.694808 kubelet[2835]: E0213 19:22:34.694695 2835 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Feb 13 19:22:34.696902 systemd[1]: Created slice kubepods-burstable-pod6f8fe6ce_d949_403c_a86a_82a4773819d5.slice - libcontainer container kubepods-burstable-pod6f8fe6ce_d949_403c_a86a_82a4773819d5.slice. Feb 13 19:22:34.708619 systemd[1]: Created slice kubepods-besteffort-podcda0b9c0_3854_44b6_bafc_261c79251f6a.slice - libcontainer container kubepods-besteffort-podcda0b9c0_3854_44b6_bafc_261c79251f6a.slice. Feb 13 19:22:34.712604 systemd[1]: Created slice kubepods-burstable-podc028a42e_4dac_4c57_bfb3_5f11422685ac.slice - libcontainer container kubepods-burstable-podc028a42e_4dac_4c57_bfb3_5f11422685ac.slice. 
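These reflector warnings are the node authorizer at work: the kubelet identity system:node:localhost may only read ConfigMaps and Secrets referenced by pods already bound to that node, so list/watch is denied until the authorizer's graph records the relationship; in the meantime the volume manager keeps retrying the mount with a growing delay (the durationBeforeRetry 500ms entries shortly below). A generic doubling-backoff sketch in that spirit follows; kubelet's real nestedpendingoperations logic is more involved.

// Generic retry-with-doubling-backoff sketch, in the spirit of the
// "No retries permitted until ... (durationBeforeRetry 500ms)" records
// below; names and timings here are illustrative assumptions.
package main

import (
	"errors"
	"fmt"
	"time"
)

// retryWithBackoff retries op, doubling the wait between attempts up to max.
func retryWithBackoff(op func() error, initial, max time.Duration, attempts int) error {
	delay := initial
	for i := 1; i <= attempts; i++ {
		if err := op(); err == nil {
			return nil
		}
		fmt.Printf("attempt %d failed; no retries permitted for %s\n", i, delay)
		time.Sleep(delay)
		delay *= 2
		if delay > max {
			delay = max
		}
	}
	return errors.New("operation failed after all attempts")
}

func main() {
	start := time.Now()
	err := retryWithBackoff(func() error {
		// Stand-in for MountVolume.SetUp: succeeds once the configmap cache
		// has had time to sync (assumed 1.5s here for the demo).
		if time.Since(start) < 1500*time.Millisecond {
			return errors.New("failed to sync configmap cache: timed out waiting for the condition")
		}
		return nil
	}, 500*time.Millisecond, 2*time.Minute, 5)
	if err != nil {
		fmt.Println(err)
	}
}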
Feb 13 19:22:34.722964 kubelet[2835]: I0213 19:22:34.719889 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48ksg\" (UniqueName: \"kubernetes.io/projected/c028a42e-4dac-4c57-bfb3-5f11422685ac-kube-api-access-48ksg\") pod \"coredns-668d6bf9bc-dfpnq\" (UID: \"c028a42e-4dac-4c57-bfb3-5f11422685ac\") " pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:34.722964 kubelet[2835]: I0213 19:22:34.719912 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6f8fe6ce-d949-403c-a86a-82a4773819d5-config-volume\") pod \"coredns-668d6bf9bc-wvctv\" (UID: \"6f8fe6ce-d949-403c-a86a-82a4773819d5\") " pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:34.722964 kubelet[2835]: I0213 19:22:34.719923 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n5tr\" (UniqueName: \"kubernetes.io/projected/6f8fe6ce-d949-403c-a86a-82a4773819d5-kube-api-access-7n5tr\") pod \"coredns-668d6bf9bc-wvctv\" (UID: \"6f8fe6ce-d949-403c-a86a-82a4773819d5\") " pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:34.722964 kubelet[2835]: I0213 19:22:34.719935 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-265hn\" (UniqueName: \"kubernetes.io/projected/40b4dc54-80eb-4f55-8dea-43e4da051ec7-kube-api-access-265hn\") pod \"calico-apiserver-64b59d7dcb-8vjgc\" (UID: \"40b4dc54-80eb-4f55-8dea-43e4da051ec7\") " pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:34.722964 kubelet[2835]: I0213 19:22:34.719944 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvrd2\" (UniqueName: \"kubernetes.io/projected/cda0b9c0-3854-44b6-bafc-261c79251f6a-kube-api-access-xvrd2\") pod \"calico-apiserver-64b59d7dcb-r4fxw\" (UID: \"cda0b9c0-3854-44b6-bafc-261c79251f6a\") " pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:34.723082 kubelet[2835]: I0213 19:22:34.719957 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c028a42e-4dac-4c57-bfb3-5f11422685ac-config-volume\") pod \"coredns-668d6bf9bc-dfpnq\" (UID: \"c028a42e-4dac-4c57-bfb3-5f11422685ac\") " pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:34.723082 kubelet[2835]: I0213 19:22:34.719967 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/cda0b9c0-3854-44b6-bafc-261c79251f6a-calico-apiserver-certs\") pod \"calico-apiserver-64b59d7dcb-r4fxw\" (UID: \"cda0b9c0-3854-44b6-bafc-261c79251f6a\") " pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:34.723082 kubelet[2835]: I0213 19:22:34.719980 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/40b4dc54-80eb-4f55-8dea-43e4da051ec7-calico-apiserver-certs\") pod \"calico-apiserver-64b59d7dcb-8vjgc\" (UID: \"40b4dc54-80eb-4f55-8dea-43e4da051ec7\") " pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:34.723082 kubelet[2835]: I0213 19:22:34.719999 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/831dd8f9-aeed-441d-a3bb-5e45b59bb4ba-tigera-ca-bundle\") pod \"calico-kube-controllers-b645f959d-hxt68\" (UID: \"831dd8f9-aeed-441d-a3bb-5e45b59bb4ba\") " pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:34.723082 kubelet[2835]: I0213 19:22:34.720020 2835 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxg9c\" (UniqueName: \"kubernetes.io/projected/831dd8f9-aeed-441d-a3bb-5e45b59bb4ba-kube-api-access-mxg9c\") pod \"calico-kube-controllers-b645f959d-hxt68\" (UID: \"831dd8f9-aeed-441d-a3bb-5e45b59bb4ba\") " pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:34.727937 containerd[1558]: time="2025-02-13T19:22:34.727854007Z" level=info msg="shim disconnected" id=31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d namespace=k8s.io Feb 13 19:22:34.727937 containerd[1558]: time="2025-02-13T19:22:34.727933710Z" level=warning msg="cleaning up after shim disconnected" id=31f9cd3b29b244e7d8c32b787a215cd42440195bd38ffe88507cc80465b89c7d namespace=k8s.io Feb 13 19:22:34.727937 containerd[1558]: time="2025-02-13T19:22:34.727940183Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:22:34.974513 containerd[1558]: time="2025-02-13T19:22:34.974468738Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:0,}" Feb 13 19:22:35.130154 containerd[1558]: time="2025-02-13T19:22:35.130133553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 19:22:35.814971 systemd[1]: Created slice kubepods-besteffort-podd4951e2a_1949_44bc_afb1_457c1decf801.slice - libcontainer container kubepods-besteffort-podd4951e2a_1949_44bc_afb1_457c1decf801.slice. Feb 13 19:22:35.823872 kubelet[2835]: E0213 19:22:35.823849 2835 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Feb 13 19:22:35.824069 kubelet[2835]: E0213 19:22:35.823905 2835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c028a42e-4dac-4c57-bfb3-5f11422685ac-config-volume podName:c028a42e-4dac-4c57-bfb3-5f11422685ac nodeName:}" failed. No retries permitted until 2025-02-13 19:22:36.323888673 +0000 UTC m=+28.633089306 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/c028a42e-4dac-4c57-bfb3-5f11422685ac-config-volume") pod "coredns-668d6bf9bc-dfpnq" (UID: "c028a42e-4dac-4c57-bfb3-5f11422685ac") : failed to sync configmap cache: timed out waiting for the condition Feb 13 19:22:35.824262 kubelet[2835]: E0213 19:22:35.824230 2835 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Feb 13 19:22:35.824262 kubelet[2835]: E0213 19:22:35.824259 2835 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f8fe6ce-d949-403c-a86a-82a4773819d5-config-volume podName:6f8fe6ce-d949-403c-a86a-82a4773819d5 nodeName:}" failed. No retries permitted until 2025-02-13 19:22:36.324251022 +0000 UTC m=+28.633451656 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/6f8fe6ce-d949-403c-a86a-82a4773819d5-config-volume") pod "coredns-668d6bf9bc-wvctv" (UID: "6f8fe6ce-d949-403c-a86a-82a4773819d5") : failed to sync configmap cache: timed out waiting for the condition Feb 13 19:22:35.835135 containerd[1558]: time="2025-02-13T19:22:35.834834488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:0,}" Feb 13 19:22:35.894685 containerd[1558]: time="2025-02-13T19:22:35.894642135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:0,}" Feb 13 19:22:35.911745 containerd[1558]: time="2025-02-13T19:22:35.911394787Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:0,}" Feb 13 19:22:36.001186 containerd[1558]: time="2025-02-13T19:22:36.001148822Z" level=error msg="Failed to destroy network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.003663 containerd[1558]: time="2025-02-13T19:22:36.003643492Z" level=error msg="encountered an error cleaning up failed sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.003703 containerd[1558]: time="2025-02-13T19:22:36.003693775Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.018521 kubelet[2835]: E0213 19:22:36.018484 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.018606 kubelet[2835]: E0213 19:22:36.018544 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:36.018606 kubelet[2835]: E0213 19:22:36.018561 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:36.018606 kubelet[2835]: E0213 19:22:36.018592 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podUID="831dd8f9-aeed-441d-a3bb-5e45b59bb4ba" Feb 13 19:22:36.065120 containerd[1558]: time="2025-02-13T19:22:36.064850680Z" level=error msg="Failed to destroy network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.065120 containerd[1558]: time="2025-02-13T19:22:36.065066097Z" level=error msg="encountered an error cleaning up failed sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.065120 containerd[1558]: time="2025-02-13T19:22:36.065100984Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.073251 kubelet[2835]: E0213 19:22:36.065227 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.073251 kubelet[2835]: E0213 19:22:36.065265 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:36.073251 kubelet[2835]: E0213 
19:22:36.065278 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:36.079727 kubelet[2835]: E0213 19:22:36.065301 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:36.085920 containerd[1558]: time="2025-02-13T19:22:36.085884782Z" level=error msg="Failed to destroy network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086051 containerd[1558]: time="2025-02-13T19:22:36.085885926Z" level=error msg="Failed to destroy network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086262 containerd[1558]: time="2025-02-13T19:22:36.086107004Z" level=error msg="encountered an error cleaning up failed sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086262 containerd[1558]: time="2025-02-13T19:22:36.086137754Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086262 containerd[1558]: time="2025-02-13T19:22:36.086166747Z" level=error msg="encountered an error cleaning up failed sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086262 containerd[1558]: time="2025-02-13T19:22:36.086197451Z" 
level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086377 kubelet[2835]: E0213 19:22:36.086270 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086377 kubelet[2835]: E0213 19:22:36.086306 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:36.086377 kubelet[2835]: E0213 19:22:36.086318 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:36.086443 kubelet[2835]: E0213 19:22:36.086342 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podUID="cda0b9c0-3854-44b6-bafc-261c79251f6a" Feb 13 19:22:36.086611 kubelet[2835]: E0213 19:22:36.086469 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.086611 kubelet[2835]: E0213 19:22:36.086481 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:36.086611 kubelet[2835]: E0213 19:22:36.086488 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:36.086693 kubelet[2835]: E0213 19:22:36.086502 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podUID="40b4dc54-80eb-4f55-8dea-43e4da051ec7" Feb 13 19:22:36.111847 kubelet[2835]: I0213 19:22:36.111766 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc" Feb 13 19:22:36.113991 kubelet[2835]: I0213 19:22:36.113867 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1" Feb 13 19:22:36.196539 kubelet[2835]: I0213 19:22:36.196513 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518" Feb 13 19:22:36.197560 kubelet[2835]: I0213 19:22:36.197541 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5" Feb 13 19:22:36.218608 containerd[1558]: time="2025-02-13T19:22:36.218555011Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:22:36.218608 containerd[1558]: time="2025-02-13T19:22:36.218576206Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.220548049Z" level=info msg="Ensure that sandbox 9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1 in task-service has been cleanup successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.220715264Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.220726367Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.220855159Z" level=info msg="StopPodSandbox for 
\"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.220975048Z" level=info msg="Ensure that sandbox b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc in task-service has been cleanup successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.221087106Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.221096493Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:22:36.221263 containerd[1558]: time="2025-02-13T19:22:36.221183611Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.221295417Z" level=info msg="Ensure that sandbox 9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5 in task-service has been cleanup successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.221501289Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.221510741Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.221835724Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:1,}" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.222341783Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:1,}" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.222480603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:1,}" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.223178802Z" level=info msg="Ensure that sandbox eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518 in task-service has been cleanup successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.223298310Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.223308678Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:22:36.228572 containerd[1558]: time="2025-02-13T19:22:36.223625904Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:1,}" Feb 13 19:22:36.400184 systemd[1]: run-netns-cni\x2d9f224262\x2d02a6\x2ddeb6\x2d60f9\x2d30f9df791baa.mount: Deactivated successfully. Feb 13 19:22:36.400253 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1-shm.mount: Deactivated successfully. 
Feb 13 19:22:36.400308 systemd[1]: run-netns-cni\x2def9ed579\x2dc16d\x2d9076\x2d66f2\x2db4196feffb9e.mount: Deactivated successfully. Feb 13 19:22:36.400353 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5-shm.mount: Deactivated successfully. Feb 13 19:22:36.500056 containerd[1558]: time="2025-02-13T19:22:36.499842392Z" level=error msg="Failed to destroy network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.501992 containerd[1558]: time="2025-02-13T19:22:36.501872084Z" level=error msg="encountered an error cleaning up failed sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.501992 containerd[1558]: time="2025-02-13T19:22:36.501939006Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.502475 kubelet[2835]: E0213 19:22:36.502122 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.502475 kubelet[2835]: E0213 19:22:36.502158 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:36.502475 kubelet[2835]: E0213 19:22:36.502173 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:36.502541 kubelet[2835]: E0213 19:22:36.502201 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podUID="cda0b9c0-3854-44b6-bafc-261c79251f6a" Feb 13 19:22:36.511265 containerd[1558]: time="2025-02-13T19:22:36.511130770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:36.515567 containerd[1558]: time="2025-02-13T19:22:36.515545711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:0,}" Feb 13 19:22:36.548224 containerd[1558]: time="2025-02-13T19:22:36.547994625Z" level=error msg="Failed to destroy network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.549404 containerd[1558]: time="2025-02-13T19:22:36.549382256Z" level=error msg="Failed to destroy network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.550060 containerd[1558]: time="2025-02-13T19:22:36.550038951Z" level=error msg="encountered an error cleaning up failed sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.550186 containerd[1558]: time="2025-02-13T19:22:36.550174526Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.550432 containerd[1558]: time="2025-02-13T19:22:36.550415106Z" level=error msg="encountered an error cleaning up failed sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.550470 kubelet[2835]: E0213 19:22:36.550362 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.550514 kubelet[2835]: E0213 19:22:36.550485 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:36.550514 kubelet[2835]: E0213 19:22:36.550499 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:36.550554 kubelet[2835]: E0213 19:22:36.550529 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:36.552628 containerd[1558]: time="2025-02-13T19:22:36.552368371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.553142 kubelet[2835]: E0213 19:22:36.553078 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.553277 kubelet[2835]: E0213 19:22:36.553255 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:36.553367 kubelet[2835]: E0213 19:22:36.553353 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:36.553509 kubelet[2835]: E0213 19:22:36.553447 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podUID="40b4dc54-80eb-4f55-8dea-43e4da051ec7" Feb 13 19:22:36.562116 containerd[1558]: time="2025-02-13T19:22:36.561431860Z" level=error msg="Failed to destroy network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.562116 containerd[1558]: time="2025-02-13T19:22:36.561909158Z" level=error msg="encountered an error cleaning up failed sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.562116 containerd[1558]: time="2025-02-13T19:22:36.561941508Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.562261 kubelet[2835]: E0213 19:22:36.562078 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.562261 kubelet[2835]: E0213 19:22:36.562113 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:36.562261 
kubelet[2835]: E0213 19:22:36.562128 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:36.562326 kubelet[2835]: E0213 19:22:36.562152 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podUID="831dd8f9-aeed-441d-a3bb-5e45b59bb4ba" Feb 13 19:22:36.596253 containerd[1558]: time="2025-02-13T19:22:36.596145907Z" level=error msg="Failed to destroy network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.596421 containerd[1558]: time="2025-02-13T19:22:36.596292582Z" level=error msg="Failed to destroy network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.596592 containerd[1558]: time="2025-02-13T19:22:36.596578853Z" level=error msg="encountered an error cleaning up failed sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.596668 containerd[1558]: time="2025-02-13T19:22:36.596655864Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.596748 containerd[1558]: time="2025-02-13T19:22:36.596587572Z" level=error msg="encountered an error cleaning up failed sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 
19:22:36.596802 containerd[1558]: time="2025-02-13T19:22:36.596792460Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.596990 kubelet[2835]: E0213 19:22:36.596960 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.597030 kubelet[2835]: E0213 19:22:36.597002 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:36.597030 kubelet[2835]: E0213 19:22:36.597014 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:36.597070 kubelet[2835]: E0213 19:22:36.597036 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dfpnq" podUID="c028a42e-4dac-4c57-bfb3-5f11422685ac" Feb 13 19:22:36.597070 kubelet[2835]: E0213 19:22:36.597065 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:36.597136 kubelet[2835]: E0213 19:22:36.597076 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:36.597136 kubelet[2835]: E0213 19:22:36.597086 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:36.597136 kubelet[2835]: E0213 19:22:36.597109 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wvctv" podUID="6f8fe6ce-d949-403c-a86a-82a4773819d5" Feb 13 19:22:37.200836 kubelet[2835]: I0213 19:22:37.200042 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176" Feb 13 19:22:37.201214 containerd[1558]: time="2025-02-13T19:22:37.201193858Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:22:37.201467 containerd[1558]: time="2025-02-13T19:22:37.201453107Z" level=info msg="Ensure that sandbox 2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176 in task-service has been cleanup successfully" Feb 13 19:22:37.201792 containerd[1558]: time="2025-02-13T19:22:37.201679676Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:22:37.201860 containerd[1558]: time="2025-02-13T19:22:37.201792354Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:22:37.202396 containerd[1558]: time="2025-02-13T19:22:37.202200019Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:22:37.202428 containerd[1558]: time="2025-02-13T19:22:37.202419213Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:22:37.202447 containerd[1558]: time="2025-02-13T19:22:37.202426954Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:22:37.202797 containerd[1558]: time="2025-02-13T19:22:37.202774256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:2,}" Feb 13 19:22:37.217998 kubelet[2835]: I0213 19:22:37.217848 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58" Feb 13 19:22:37.218768 
kubelet[2835]: I0213 19:22:37.218719 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812" Feb 13 19:22:37.219801 containerd[1558]: time="2025-02-13T19:22:37.219773393Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.219931491Z" level=info msg="Ensure that sandbox 4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812 in task-service has been cleanup successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220143976Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220230057Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220238637Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220243217Z" level=info msg="Ensure that sandbox 2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58 in task-service has been cleanup successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220351989Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220359908Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220623440Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220661200Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:1,}" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220662471Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.220861976Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221143993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:2,}" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221464999Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221609131Z" level=info msg="Ensure that sandbox a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7 in task-service has been cleanup successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221702332Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 
19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221710786Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.221894576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:1,}" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.222884948Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223032472Z" level=info msg="Ensure that sandbox 356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b in task-service has been cleanup successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223164195Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223172452Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223541461Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223581215Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223586913Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:22:37.225119 containerd[1558]: time="2025-02-13T19:22:37.223883367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:2,}" Feb 13 19:22:37.230498 kubelet[2835]: I0213 19:22:37.220943 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7" Feb 13 19:22:37.230498 kubelet[2835]: I0213 19:22:37.222563 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b" Feb 13 19:22:37.230498 kubelet[2835]: I0213 19:22:37.225437 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.226768694Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.226999803Z" level=info msg="Ensure that sandbox 8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0 in task-service has been cleanup successfully" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227113774Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227122349Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" 
returns successfully" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227362547Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227408010Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227433094Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:22:37.230570 containerd[1558]: time="2025-02-13T19:22:37.227677170Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:2,}" Feb 13 19:22:37.398761 systemd[1]: run-netns-cni\x2d35eeed57\x2d752d\x2d036b\x2da480\x2d70067bc1f587.mount: Deactivated successfully. Feb 13 19:22:37.398831 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0-shm.mount: Deactivated successfully. Feb 13 19:22:37.398880 systemd[1]: run-netns-cni\x2d0759c3c9\x2d588d\x2d56c4\x2d4e8d\x2d3449d8e38929.mount: Deactivated successfully. Feb 13 19:22:37.398916 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58-shm.mount: Deactivated successfully. Feb 13 19:22:37.398954 systemd[1]: run-netns-cni\x2d2438bd43\x2dd302\x2ddc4e\x2d395c\x2d6959e08a5487.mount: Deactivated successfully. Feb 13 19:22:37.398990 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b-shm.mount: Deactivated successfully. 
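Every sandbox failure in the stretch above bottoms out in the same stat: the Calico CNI plugin cannot find /var/lib/calico/nodename, the file the calico/node container writes on startup through its mount of /var/lib/calico/. Until that file exists every CNI ADD fails, kubelet tears the half-built sandbox down again, and systemd is left deactivating the leftover run-netns and sandbox shm mount units, as in the lines just above. Below is a minimal sketch of the check the plugin is evidently performing, assuming only what the log line itself states; the path and the hint text are taken from the log, while the function name and error wording are illustrative, not Calico's actual API.

```go
// Illustrative sketch only: mirrors the check implied by the repeated log
// line "stat /var/lib/calico/nodename: no such file or directory". The
// function name is hypothetical; it is not copied from Calico's source.
package main

import (
	"fmt"
	"os"
	"strings"
)

const nodenameFile = "/var/lib/calico/nodename" // path taken from the log

// readNodename returns the node name that calico/node records at startup.
// Every CNI ADD and DEL in the log fails at this point while the file is
// missing, i.e. while calico/node has not run on this host or does not
// have /var/lib/calico/ mounted.
func readNodename() (string, error) {
	data, err := os.ReadFile(nodenameFile)
	if err != nil {
		return "", fmt.Errorf("read %s: %w: check that the calico/node "+
			"container is running and has mounted /var/lib/calico/",
			nodenameFile, err)
	}
	return strings.TrimSpace(string(data)), nil
}

func main() {
	name, err := readNodename()
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("nodename:", name)
}
```

In a stock Calico install the file appears once the calico-node DaemonSet pod starts on the node, so the hint embedded in the error usually means calico-node is not yet running here, or its /var/lib/calico hostPath volume is not mounted.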
Feb 13 19:22:37.899601 containerd[1558]: time="2025-02-13T19:22:37.899292635Z" level=error msg="Failed to destroy network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.899601 containerd[1558]: time="2025-02-13T19:22:37.899489087Z" level=error msg="encountered an error cleaning up failed sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.899601 containerd[1558]: time="2025-02-13T19:22:37.899524008Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.899757 kubelet[2835]: E0213 19:22:37.899647 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.899757 kubelet[2835]: E0213 19:22:37.899683 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:37.899757 kubelet[2835]: E0213 19:22:37.899697 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:37.899871 kubelet[2835]: E0213 19:22:37.899734 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wvctv" 
podUID="6f8fe6ce-d949-403c-a86a-82a4773819d5" Feb 13 19:22:37.909286 containerd[1558]: time="2025-02-13T19:22:37.909259336Z" level=error msg="Failed to destroy network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.909826 containerd[1558]: time="2025-02-13T19:22:37.909578960Z" level=error msg="encountered an error cleaning up failed sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.909826 containerd[1558]: time="2025-02-13T19:22:37.909619215Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.916830 kubelet[2835]: E0213 19:22:37.916634 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.916830 kubelet[2835]: E0213 19:22:37.916684 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:37.916830 kubelet[2835]: E0213 19:22:37.916704 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:37.916966 kubelet[2835]: E0213 19:22:37.916735 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podUID="831dd8f9-aeed-441d-a3bb-5e45b59bb4ba" Feb 13 19:22:37.923450 containerd[1558]: time="2025-02-13T19:22:37.923223514Z" level=error msg="Failed to destroy network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.924302 containerd[1558]: time="2025-02-13T19:22:37.924272027Z" level=error msg="encountered an error cleaning up failed sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.925085 containerd[1558]: time="2025-02-13T19:22:37.924325236Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.925162 kubelet[2835]: E0213 19:22:37.924497 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.925162 kubelet[2835]: E0213 19:22:37.924546 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:37.925162 kubelet[2835]: E0213 19:22:37.924566 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:37.925232 kubelet[2835]: E0213 19:22:37.924601 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podUID="cda0b9c0-3854-44b6-bafc-261c79251f6a" Feb 13 19:22:37.930641 containerd[1558]: time="2025-02-13T19:22:37.930607632Z" level=error msg="Failed to destroy network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.930861 containerd[1558]: time="2025-02-13T19:22:37.930843171Z" level=error msg="encountered an error cleaning up failed sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.930896 containerd[1558]: time="2025-02-13T19:22:37.930880343Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.931590 kubelet[2835]: E0213 19:22:37.931005 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.931590 kubelet[2835]: E0213 19:22:37.931044 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:37.931590 kubelet[2835]: E0213 19:22:37.931060 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:37.931677 kubelet[2835]: E0213 19:22:37.931086 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dfpnq" podUID="c028a42e-4dac-4c57-bfb3-5f11422685ac" Feb 13 19:22:37.939826 containerd[1558]: time="2025-02-13T19:22:37.939787357Z" level=error msg="Failed to destroy network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.940303 containerd[1558]: time="2025-02-13T19:22:37.940286845Z" level=error msg="encountered an error cleaning up failed sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.940349 containerd[1558]: time="2025-02-13T19:22:37.940323480Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.940683 kubelet[2835]: E0213 19:22:37.940467 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.940683 kubelet[2835]: E0213 19:22:37.940503 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:37.940683 kubelet[2835]: E0213 19:22:37.940513 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:37.940759 kubelet[2835]: E0213 19:22:37.940539 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\" with CreatePodSandboxError: \"Failed to create sandbox 
for pod \\\"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podUID="40b4dc54-80eb-4f55-8dea-43e4da051ec7" Feb 13 19:22:37.953494 containerd[1558]: time="2025-02-13T19:22:37.953449157Z" level=error msg="Failed to destroy network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.954904 containerd[1558]: time="2025-02-13T19:22:37.954421325Z" level=error msg="encountered an error cleaning up failed sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.954904 containerd[1558]: time="2025-02-13T19:22:37.954460370Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.955081 kubelet[2835]: E0213 19:22:37.954586 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:37.955081 kubelet[2835]: E0213 19:22:37.954620 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:37.955081 kubelet[2835]: E0213 19:22:37.954633 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:37.955156 kubelet[2835]: E0213 19:22:37.954661 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:38.227670 kubelet[2835]: I0213 19:22:38.227649 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd" Feb 13 19:22:38.228704 containerd[1558]: time="2025-02-13T19:22:38.228461321Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:22:38.228704 containerd[1558]: time="2025-02-13T19:22:38.228620819Z" level=info msg="Ensure that sandbox 023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd in task-service has been cleanup successfully" Feb 13 19:22:38.229089 containerd[1558]: time="2025-02-13T19:22:38.229077515Z" level=info msg="TearDown network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" successfully" Feb 13 19:22:38.229211 containerd[1558]: time="2025-02-13T19:22:38.229089657Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" returns successfully" Feb 13 19:22:38.229689 containerd[1558]: time="2025-02-13T19:22:38.229633410Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:22:38.230462 containerd[1558]: time="2025-02-13T19:22:38.230379034Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:22:38.230462 containerd[1558]: time="2025-02-13T19:22:38.230390493Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:22:38.230937 containerd[1558]: time="2025-02-13T19:22:38.230840815Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:22:38.231931 containerd[1558]: time="2025-02-13T19:22:38.231920691Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:22:38.231996 containerd[1558]: time="2025-02-13T19:22:38.231987076Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:22:38.232269 kubelet[2835]: I0213 19:22:38.232222 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db" Feb 13 19:22:38.233029 containerd[1558]: time="2025-02-13T19:22:38.233009312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:3,}" Feb 13 19:22:38.233374 containerd[1558]: time="2025-02-13T19:22:38.233016659Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:22:38.233732 
containerd[1558]: time="2025-02-13T19:22:38.233579062Z" level=info msg="Ensure that sandbox b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db in task-service has been cleanup successfully" Feb 13 19:22:38.234417 containerd[1558]: time="2025-02-13T19:22:38.234405538Z" level=info msg="TearDown network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" successfully" Feb 13 19:22:38.234522 containerd[1558]: time="2025-02-13T19:22:38.234513870Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" returns successfully" Feb 13 19:22:38.235622 containerd[1558]: time="2025-02-13T19:22:38.235525419Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:22:38.235622 containerd[1558]: time="2025-02-13T19:22:38.235585328Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:22:38.235622 containerd[1558]: time="2025-02-13T19:22:38.235595540Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" returns successfully" Feb 13 19:22:38.237278 containerd[1558]: time="2025-02-13T19:22:38.237254620Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:22:38.237461 containerd[1558]: time="2025-02-13T19:22:38.237397446Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:22:38.237461 containerd[1558]: time="2025-02-13T19:22:38.237406816Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:22:38.238026 containerd[1558]: time="2025-02-13T19:22:38.237966094Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:3,}" Feb 13 19:22:38.238061 kubelet[2835]: I0213 19:22:38.238000 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82" Feb 13 19:22:38.239012 containerd[1558]: time="2025-02-13T19:22:38.238637386Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:22:38.239012 containerd[1558]: time="2025-02-13T19:22:38.238739120Z" level=info msg="Ensure that sandbox ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82 in task-service has been cleanup successfully" Feb 13 19:22:38.239437 containerd[1558]: time="2025-02-13T19:22:38.239426893Z" level=info msg="TearDown network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" successfully" Feb 13 19:22:38.239744 containerd[1558]: time="2025-02-13T19:22:38.239734381Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" returns successfully" Feb 13 19:22:38.241206 containerd[1558]: time="2025-02-13T19:22:38.241194197Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:22:38.241373 containerd[1558]: time="2025-02-13T19:22:38.241327336Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:22:38.241373 containerd[1558]: time="2025-02-13T19:22:38.241343451Z" 
level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:22:38.242900 containerd[1558]: time="2025-02-13T19:22:38.242644378Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:22:38.242900 containerd[1558]: time="2025-02-13T19:22:38.242686273Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:22:38.242900 containerd[1558]: time="2025-02-13T19:22:38.242692254Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:22:38.243080 containerd[1558]: time="2025-02-13T19:22:38.243070333Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:3,}" Feb 13 19:22:38.243916 kubelet[2835]: I0213 19:22:38.243603 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02" Feb 13 19:22:38.258722 containerd[1558]: time="2025-02-13T19:22:38.258698605Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:22:38.258965 containerd[1558]: time="2025-02-13T19:22:38.258949013Z" level=info msg="Ensure that sandbox 33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02 in task-service has been cleanup successfully" Feb 13 19:22:38.259789 containerd[1558]: time="2025-02-13T19:22:38.259775340Z" level=info msg="TearDown network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" successfully" Feb 13 19:22:38.259789 containerd[1558]: time="2025-02-13T19:22:38.259786106Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" returns successfully" Feb 13 19:22:38.260427 containerd[1558]: time="2025-02-13T19:22:38.260238728Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:22:38.260427 containerd[1558]: time="2025-02-13T19:22:38.260372278Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:22:38.260427 containerd[1558]: time="2025-02-13T19:22:38.260380117Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:22:38.260759 kubelet[2835]: I0213 19:22:38.260568 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f" Feb 13 19:22:38.263363 containerd[1558]: time="2025-02-13T19:22:38.262228128Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:22:38.263363 containerd[1558]: time="2025-02-13T19:22:38.262289688Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:22:38.263363 containerd[1558]: time="2025-02-13T19:22:38.262296418Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:22:38.265795 containerd[1558]: time="2025-02-13T19:22:38.265777954Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:3,}" Feb 13 19:22:38.266584 containerd[1558]: time="2025-02-13T19:22:38.266571531Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:22:38.266787 containerd[1558]: time="2025-02-13T19:22:38.266746095Z" level=info msg="Ensure that sandbox 212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f in task-service has been cleanup successfully" Feb 13 19:22:38.268075 kubelet[2835]: I0213 19:22:38.268064 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241" Feb 13 19:22:38.269229 containerd[1558]: time="2025-02-13T19:22:38.269212372Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:22:38.270911 containerd[1558]: time="2025-02-13T19:22:38.270893502Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:22:38.271003 containerd[1558]: time="2025-02-13T19:22:38.269736488Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:22:38.271528 containerd[1558]: time="2025-02-13T19:22:38.271444751Z" level=info msg="Ensure that sandbox 7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241 in task-service has been cleanup successfully" Feb 13 19:22:38.271997 containerd[1558]: time="2025-02-13T19:22:38.271985978Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:22:38.273239 containerd[1558]: time="2025-02-13T19:22:38.273217191Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:22:38.273299 containerd[1558]: time="2025-02-13T19:22:38.273291704Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:22:38.273468 containerd[1558]: time="2025-02-13T19:22:38.273172705Z" level=info msg="TearDown network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" successfully" Feb 13 19:22:38.273468 containerd[1558]: time="2025-02-13T19:22:38.273381303Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" returns successfully" Feb 13 19:22:38.274117 containerd[1558]: time="2025-02-13T19:22:38.274102054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:2,}" Feb 13 19:22:38.274824 containerd[1558]: time="2025-02-13T19:22:38.274799670Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:22:38.274931 containerd[1558]: time="2025-02-13T19:22:38.274919191Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 19:22:38.274995 containerd[1558]: time="2025-02-13T19:22:38.274984587Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:22:38.276017 containerd[1558]: time="2025-02-13T19:22:38.276005344Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:2,}" Feb 13 19:22:38.358114 containerd[1558]: time="2025-02-13T19:22:38.358087180Z" level=error msg="Failed to destroy network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.358529 containerd[1558]: time="2025-02-13T19:22:38.358439282Z" level=error msg="encountered an error cleaning up failed sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.358529 containerd[1558]: time="2025-02-13T19:22:38.358477602Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.358829 kubelet[2835]: E0213 19:22:38.358702 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.358829 kubelet[2835]: E0213 19:22:38.358739 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:38.358829 kubelet[2835]: E0213 19:22:38.358755 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:38.359753 kubelet[2835]: E0213 19:22:38.358811 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podUID="cda0b9c0-3854-44b6-bafc-261c79251f6a" Feb 13 19:22:38.395833 containerd[1558]: time="2025-02-13T19:22:38.395692595Z" level=error msg="Failed to destroy network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.395926 containerd[1558]: time="2025-02-13T19:22:38.395910633Z" level=error msg="encountered an error cleaning up failed sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.395961 containerd[1558]: time="2025-02-13T19:22:38.395949119Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.396366 kubelet[2835]: E0213 19:22:38.396120 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.396366 kubelet[2835]: E0213 19:22:38.396266 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:38.396366 kubelet[2835]: E0213 19:22:38.396279 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:38.396444 kubelet[2835]: E0213 19:22:38.396319 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\\\": rpc error: code = Unknown desc = failed 
to setup network for sandbox \\\"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podUID="40b4dc54-80eb-4f55-8dea-43e4da051ec7" Feb 13 19:22:38.405531 containerd[1558]: time="2025-02-13T19:22:38.405375666Z" level=error msg="Failed to destroy network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.407327 systemd[1]: run-netns-cni\x2d98d9bda8\x2dcf03\x2de9ff\x2d74a0\x2dc87b19b78a55.mount: Deactivated successfully. Feb 13 19:22:38.407396 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f-shm.mount: Deactivated successfully. Feb 13 19:22:38.407440 systemd[1]: run-netns-cni\x2d8abf650c\x2deca2\x2da469\x2da542\x2d307918aae112.mount: Deactivated successfully. Feb 13 19:22:38.407478 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82-shm.mount: Deactivated successfully. Feb 13 19:22:38.412187 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06-shm.mount: Deactivated successfully. Feb 13 19:22:38.414188 kubelet[2835]: E0213 19:22:38.410885 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.414188 kubelet[2835]: E0213 19:22:38.410912 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:38.414188 kubelet[2835]: E0213 19:22:38.410924 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:38.414258 containerd[1558]: time="2025-02-13T19:22:38.408861077Z" level=error msg="encountered an error cleaning up failed sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.414258 containerd[1558]: 
time="2025-02-13T19:22:38.408919315Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.414303 kubelet[2835]: E0213 19:22:38.410947 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wvctv" podUID="6f8fe6ce-d949-403c-a86a-82a4773819d5" Feb 13 19:22:38.415721 containerd[1558]: time="2025-02-13T19:22:38.415696020Z" level=error msg="Failed to destroy network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.417780 containerd[1558]: time="2025-02-13T19:22:38.417757254Z" level=error msg="encountered an error cleaning up failed sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.417887 containerd[1558]: time="2025-02-13T19:22:38.417874840Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.418963 kubelet[2835]: E0213 19:22:38.418072 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.418963 kubelet[2835]: E0213 19:22:38.418107 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:38.418963 kubelet[2835]: E0213 19:22:38.418120 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:38.418190 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777-shm.mount: Deactivated successfully. Feb 13 19:22:38.419090 kubelet[2835]: E0213 19:22:38.418151 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podUID="831dd8f9-aeed-441d-a3bb-5e45b59bb4ba" Feb 13 19:22:38.425549 containerd[1558]: time="2025-02-13T19:22:38.425520250Z" level=error msg="Failed to destroy network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.427369 containerd[1558]: time="2025-02-13T19:22:38.427123063Z" level=error msg="encountered an error cleaning up failed sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.427268 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549-shm.mount: Deactivated successfully. 
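[Editor's note] Every failure in this stretch has the same root cause. The Calico CNI plugin resolves the node's name by reading /var/lib/calico/nodename, a file the calico/node container writes once it is running; the error text itself points there ("check that the calico/node container is running and has mounted /var/lib/calico/"). Until that container is up, both the (add) and (delete) CNI paths fail at the initial stat, so every RunPodSandbox attempt and every cleanup of a failed sandbox logs the same message. A minimal Go sketch of that lookup, assuming only the path named in the log; this is not Calico's actual source:

    package main

    import (
    	"fmt"
    	"os"
    	"strings"
    )

    // nodenameFile is the path the failing stat calls above refer to; the
    // calico/node container is expected to write the node name here once
    // it is running.
    const nodenameFile = "/var/lib/calico/nodename"

    func readNodename() (string, error) {
    	data, err := os.ReadFile(nodenameFile)
    	if err != nil {
    		if os.IsNotExist(err) {
    			// mirrors the log: ...nodename: no such file or directory
    			return "", fmt.Errorf("%v: check that the calico/node container is running and has mounted /var/lib/calico/", err)
    		}
    		return "", err
    	}
    	return strings.TrimSpace(string(data)), nil
    }

    func main() {
    	name, err := readNodename()
    	if err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    	fmt.Println("nodename:", name)
    }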
Feb 13 19:22:38.428464 containerd[1558]: time="2025-02-13T19:22:38.427989693Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.428660 kubelet[2835]: E0213 19:22:38.428637 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.428701 kubelet[2835]: E0213 19:22:38.428674 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:38.428701 kubelet[2835]: E0213 19:22:38.428690 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:38.428767 kubelet[2835]: E0213 19:22:38.428734 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:38.439448 containerd[1558]: time="2025-02-13T19:22:38.439419153Z" level=error msg="Failed to destroy network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.440834 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6-shm.mount: Deactivated successfully. 
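[Editor's note] On the run-netns-cni\x2d... and ...-shm.mount units above: each sandbox gets a network namespace bind-mounted under /run/netns/cni-<uuid> and a per-sandbox shm tmpfs under /run/containerd/io.containerd.grpc.v1.cri/sandboxes/<id>/shm; systemd tracks both as mount units and logs "Deactivated successfully" when teardown unmounts them. In mount unit names systemd writes "/" as "-" and escapes a literal "-" as \x2d, so run-netns-cni\x2d98d9bda8... is /run/netns/cni-98d9bda8-.... A small Go sketch of the reverse mapping; the convention is systemd's, but the code is illustrative, not libsystemd:

    package main

    import (
    	"fmt"
    	"strconv"
    	"strings"
    )

    // unescapeUnitPath reverses systemd's mount-unit naming: "-" separates
    // path components, and escaped bytes appear as \xNN sequences.
    func unescapeUnitPath(unit string) string {
    	name := strings.TrimSuffix(unit, ".mount")
    	var b strings.Builder
    	b.WriteByte('/')
    	for i := 0; i < len(name); i++ {
    		switch {
    		case name[i] == '-':
    			b.WriteByte('/')
    		case name[i] == '\\' && i+3 < len(name) && name[i+1] == 'x':
    			if v, err := strconv.ParseUint(name[i+2:i+4], 16, 8); err == nil {
    				b.WriteByte(byte(v))
    				i += 3
    				continue
    			}
    			b.WriteByte(name[i])
    		default:
    			b.WriteByte(name[i])
    		}
    	}
    	return b.String()
    }

    func main() {
    	// The unit from the log above corresponds to a CNI netns bind mount:
    	fmt.Println(unescapeUnitPath(`run-netns-cni\x2d98d9bda8\x2dcf03\x2de9ff\x2d74a0\x2dc87b19b78a55.mount`))
    	// Output: /run/netns/cni-98d9bda8-cf03-e9ff-74a0-c87b19b78a55
    }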
Feb 13 19:22:38.441402 containerd[1558]: time="2025-02-13T19:22:38.441189792Z" level=error msg="encountered an error cleaning up failed sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.441466 containerd[1558]: time="2025-02-13T19:22:38.441450888Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.441615 kubelet[2835]: E0213 19:22:38.441592 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:38.441656 kubelet[2835]: E0213 19:22:38.441634 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:38.441656 kubelet[2835]: E0213 19:22:38.441648 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:38.441861 kubelet[2835]: E0213 19:22:38.441678 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dfpnq" podUID="c028a42e-4dac-4c57-bfb3-5f11422685ac" Feb 13 19:22:39.711500 kubelet[2835]: I0213 19:22:39.711479 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829" Feb 13 19:22:39.716134 containerd[1558]: time="2025-02-13T19:22:39.711945085Z" level=info msg="StopPodSandbox for 
\"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" Feb 13 19:22:39.729074 containerd[1558]: time="2025-02-13T19:22:39.728959739Z" level=info msg="Ensure that sandbox ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829 in task-service has been cleanup successfully" Feb 13 19:22:39.730869 containerd[1558]: time="2025-02-13T19:22:39.729211940Z" level=info msg="TearDown network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" successfully" Feb 13 19:22:39.730869 containerd[1558]: time="2025-02-13T19:22:39.729224729Z" level=info msg="StopPodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" returns successfully" Feb 13 19:22:39.730721 systemd[1]: run-netns-cni\x2d81bfc5cc\x2daae8\x2d3355\x2d2784\x2de3166140c1a7.mount: Deactivated successfully. Feb 13 19:22:39.732344 containerd[1558]: time="2025-02-13T19:22:39.731788512Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:22:39.732344 containerd[1558]: time="2025-02-13T19:22:39.731857893Z" level=info msg="TearDown network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" successfully" Feb 13 19:22:39.732344 containerd[1558]: time="2025-02-13T19:22:39.731865184Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" returns successfully" Feb 13 19:22:39.732515 containerd[1558]: time="2025-02-13T19:22:39.732504668Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:22:39.732605 containerd[1558]: time="2025-02-13T19:22:39.732595298Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:22:39.732658 containerd[1558]: time="2025-02-13T19:22:39.732647448Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:22:39.732910 containerd[1558]: time="2025-02-13T19:22:39.732898511Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:22:39.732993 containerd[1558]: time="2025-02-13T19:22:39.732985044Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:22:39.733029 containerd[1558]: time="2025-02-13T19:22:39.733023195Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:22:39.733476 containerd[1558]: time="2025-02-13T19:22:39.733464909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:4,}" Feb 13 19:22:39.736980 kubelet[2835]: I0213 19:22:39.736961 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777" Feb 13 19:22:39.737616 containerd[1558]: time="2025-02-13T19:22:39.737594705Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" Feb 13 19:22:39.740621 containerd[1558]: time="2025-02-13T19:22:39.740583298Z" level=info msg="Ensure that sandbox 274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777 in task-service has been cleanup successfully" Feb 13 19:22:39.742109 containerd[1558]: 
time="2025-02-13T19:22:39.741115873Z" level=info msg="TearDown network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" successfully" Feb 13 19:22:39.742109 containerd[1558]: time="2025-02-13T19:22:39.741145496Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" returns successfully" Feb 13 19:22:39.742988 systemd[1]: run-netns-cni\x2d4eb32fe0\x2db07b\x2d7903\x2d888c\x2d59855dec8a87.mount: Deactivated successfully. Feb 13 19:22:39.743546 containerd[1558]: time="2025-02-13T19:22:39.743530228Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:22:39.743650 containerd[1558]: time="2025-02-13T19:22:39.743641026Z" level=info msg="TearDown network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" successfully" Feb 13 19:22:39.743794 containerd[1558]: time="2025-02-13T19:22:39.743683251Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" returns successfully" Feb 13 19:22:39.745195 containerd[1558]: time="2025-02-13T19:22:39.745180410Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:22:39.745352 containerd[1558]: time="2025-02-13T19:22:39.745339721Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:22:39.745398 containerd[1558]: time="2025-02-13T19:22:39.745390726Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:22:39.746116 containerd[1558]: time="2025-02-13T19:22:39.746064198Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:22:39.746197 containerd[1558]: time="2025-02-13T19:22:39.746188415Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:22:39.746436 containerd[1558]: time="2025-02-13T19:22:39.746227579Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:22:39.747519 containerd[1558]: time="2025-02-13T19:22:39.747501749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:4,}" Feb 13 19:22:39.748235 kubelet[2835]: I0213 19:22:39.748220 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6" Feb 13 19:22:39.752420 containerd[1558]: time="2025-02-13T19:22:39.752178682Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:22:39.753103 containerd[1558]: time="2025-02-13T19:22:39.752987955Z" level=info msg="Ensure that sandbox 64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6 in task-service has been cleanup successfully" Feb 13 19:22:39.753390 containerd[1558]: time="2025-02-13T19:22:39.753285544Z" level=info msg="TearDown network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" successfully" Feb 13 19:22:39.753390 containerd[1558]: time="2025-02-13T19:22:39.753296143Z" level=info msg="StopPodSandbox for 
\"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" returns successfully" Feb 13 19:22:39.754586 systemd[1]: run-netns-cni\x2dc2265179\x2d124f\x2d407c\x2d9846\x2d151f6ee7f7fb.mount: Deactivated successfully. Feb 13 19:22:39.756036 containerd[1558]: time="2025-02-13T19:22:39.755748665Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:22:39.756358 containerd[1558]: time="2025-02-13T19:22:39.756285490Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:22:39.756358 containerd[1558]: time="2025-02-13T19:22:39.756296196Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:22:39.756527 kubelet[2835]: I0213 19:22:39.756304 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06" Feb 13 19:22:39.756832 containerd[1558]: time="2025-02-13T19:22:39.756744863Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:22:39.756832 containerd[1558]: time="2025-02-13T19:22:39.756811510Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:22:39.756832 containerd[1558]: time="2025-02-13T19:22:39.756832194Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:22:39.758278 containerd[1558]: time="2025-02-13T19:22:39.758071973Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:3,}" Feb 13 19:22:39.758727 containerd[1558]: time="2025-02-13T19:22:39.758694217Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" Feb 13 19:22:39.799913 containerd[1558]: time="2025-02-13T19:22:39.799889340Z" level=info msg="Ensure that sandbox ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06 in task-service has been cleanup successfully" Feb 13 19:22:39.800501 containerd[1558]: time="2025-02-13T19:22:39.800488624Z" level=info msg="TearDown network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" successfully" Feb 13 19:22:39.800552 containerd[1558]: time="2025-02-13T19:22:39.800544810Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" returns successfully" Feb 13 19:22:39.801569 systemd[1]: run-netns-cni\x2d9c1d9cf3\x2d04a5\x2dbe16\x2dde42\x2d29a7a3ea68f0.mount: Deactivated successfully. 
Feb 13 19:22:39.804484 containerd[1558]: time="2025-02-13T19:22:39.802452085Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:22:39.804484 containerd[1558]: time="2025-02-13T19:22:39.802575870Z" level=info msg="TearDown network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" successfully" Feb 13 19:22:39.804484 containerd[1558]: time="2025-02-13T19:22:39.802582606Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" returns successfully" Feb 13 19:22:39.806962 containerd[1558]: time="2025-02-13T19:22:39.806832651Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:22:39.807005 containerd[1558]: time="2025-02-13T19:22:39.806979791Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 19:22:39.807005 containerd[1558]: time="2025-02-13T19:22:39.806990979Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:22:39.807579 containerd[1558]: time="2025-02-13T19:22:39.807471608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:3,}" Feb 13 19:22:39.809746 kubelet[2835]: I0213 19:22:39.809684 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1" Feb 13 19:22:39.811681 containerd[1558]: time="2025-02-13T19:22:39.811491587Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" Feb 13 19:22:39.813168 containerd[1558]: time="2025-02-13T19:22:39.813080817Z" level=info msg="Ensure that sandbox d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1 in task-service has been cleanup successfully" Feb 13 19:22:39.816541 systemd[1]: run-netns-cni\x2d5045d558\x2d5473\x2d9744\x2de10c\x2d8d0c2b1fd872.mount: Deactivated successfully. 
Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.817606292Z" level=info msg="TearDown network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.817623533Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" returns successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.817872777Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.817922519Z" level=info msg="TearDown network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.817929039Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" returns successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818138612Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818183797Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818189681Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818468855Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818520025Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.818530720Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:22:39.842069 containerd[1558]: time="2025-02-13T19:22:39.821491632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:4,}" Feb 13 19:22:39.899750 kubelet[2835]: I0213 19:22:39.818164 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.846169896Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.846319909Z" level=info msg="Ensure that sandbox 9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549 in task-service has been cleanup successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.846474550Z" level=info msg="TearDown network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.846483515Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" returns successfully" Feb 13 19:22:39.899788 containerd[1558]: 
time="2025-02-13T19:22:39.848640494Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.849675860Z" level=info msg="TearDown network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.849685068Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" returns successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850341758Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850381519Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850387962Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" returns successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850564091Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850612073Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.850618427Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:22:39.899788 containerd[1558]: time="2025-02-13T19:22:39.851254930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:4,}" Feb 13 19:22:40.731038 systemd[1]: run-netns-cni\x2d586d9acc\x2d9ce9\x2d3a73\x2dd868\x2d21db72ff9a32.mount: Deactivated successfully. Feb 13 19:22:40.731116 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount371483004.mount: Deactivated successfully. 
Feb 13 19:22:41.104571 containerd[1558]: time="2025-02-13T19:22:41.104471951Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 19:22:41.108830 containerd[1558]: time="2025-02-13T19:22:41.108012119Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:41.139385 containerd[1558]: time="2025-02-13T19:22:41.139186022Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:41.142579 containerd[1558]: time="2025-02-13T19:22:41.142545366Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:41.143988 containerd[1558]: time="2025-02-13T19:22:41.143954251Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 6.012483087s" Feb 13 19:22:41.143988 containerd[1558]: time="2025-02-13T19:22:41.143972107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 19:22:41.169058 containerd[1558]: time="2025-02-13T19:22:41.168759627Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 19:22:41.190266 containerd[1558]: time="2025-02-13T19:22:41.190234473Z" level=error msg="Failed to destroy network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.191432 containerd[1558]: time="2025-02-13T19:22:41.191376887Z" level=error msg="encountered an error cleaning up failed sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.192007 containerd[1558]: time="2025-02-13T19:22:41.191989138Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.192350 kubelet[2835]: E0213 19:22:41.192126 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.192350 kubelet[2835]: E0213 19:22:41.192166 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:41.192350 kubelet[2835]: E0213 19:22:41.192181 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-wvctv" Feb 13 19:22:41.192615 kubelet[2835]: E0213 19:22:41.192207 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-wvctv_kube-system(6f8fe6ce-d949-403c-a86a-82a4773819d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-wvctv" podUID="6f8fe6ce-d949-403c-a86a-82a4773819d5" Feb 13 19:22:41.200309 containerd[1558]: time="2025-02-13T19:22:41.200278677Z" level=error msg="Failed to destroy network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.200494 containerd[1558]: time="2025-02-13T19:22:41.200477411Z" level=error msg="encountered an error cleaning up failed sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.200526 containerd[1558]: time="2025-02-13T19:22:41.200515595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.200808 kubelet[2835]: E0213 19:22:41.200669 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.200808 kubelet[2835]: E0213 19:22:41.200715 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:41.200808 kubelet[2835]: E0213 19:22:41.200728 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8xksv" Feb 13 19:22:41.200907 kubelet[2835]: E0213 19:22:41.200761 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8xksv_calico-system(d4951e2a-1949-44bc-afb1-457c1decf801)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8xksv" podUID="d4951e2a-1949-44bc-afb1-457c1decf801" Feb 13 19:22:41.211678 containerd[1558]: time="2025-02-13T19:22:41.211644984Z" level=error msg="Failed to destroy network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.212053 containerd[1558]: time="2025-02-13T19:22:41.212038011Z" level=error msg="encountered an error cleaning up failed sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.212135 containerd[1558]: time="2025-02-13T19:22:41.212123431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.212487 kubelet[2835]: E0213 19:22:41.212371 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.212487 kubelet[2835]: E0213 19:22:41.212408 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:41.212487 kubelet[2835]: E0213 19:22:41.212424 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" Feb 13 19:22:41.212574 kubelet[2835]: E0213 19:22:41.212453 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b645f959d-hxt68_calico-system(831dd8f9-aeed-441d-a3bb-5e45b59bb4ba)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podUID="831dd8f9-aeed-441d-a3bb-5e45b59bb4ba" Feb 13 19:22:41.213459 containerd[1558]: time="2025-02-13T19:22:41.213421279Z" level=error msg="Failed to destroy network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.213648 containerd[1558]: time="2025-02-13T19:22:41.213632793Z" level=error msg="encountered an error cleaning up failed sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.213679 containerd[1558]: time="2025-02-13T19:22:41.213662516Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 19:22:41.213783 kubelet[2835]: E0213 19:22:41.213768 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.213853 kubelet[2835]: E0213 19:22:41.213792 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:41.213853 kubelet[2835]: E0213 19:22:41.213805 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" Feb 13 19:22:41.213853 kubelet[2835]: E0213 19:22:41.213840 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-r4fxw_calico-apiserver(cda0b9c0-3854-44b6-bafc-261c79251f6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podUID="cda0b9c0-3854-44b6-bafc-261c79251f6a" Feb 13 19:22:41.216226 containerd[1558]: time="2025-02-13T19:22:41.216205361Z" level=error msg="Failed to destroy network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.216374 containerd[1558]: time="2025-02-13T19:22:41.216357350Z" level=error msg="encountered an error cleaning up failed sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.216402 containerd[1558]: time="2025-02-13T19:22:41.216387412Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.216561 kubelet[2835]: E0213 19:22:41.216467 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.216561 kubelet[2835]: E0213 19:22:41.216487 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:41.216561 kubelet[2835]: E0213 19:22:41.216500 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" Feb 13 19:22:41.216643 kubelet[2835]: E0213 19:22:41.216521 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64b59d7dcb-8vjgc_calico-apiserver(40b4dc54-80eb-4f55-8dea-43e4da051ec7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podUID="40b4dc54-80eb-4f55-8dea-43e4da051ec7" Feb 13 19:22:41.253658 containerd[1558]: time="2025-02-13T19:22:41.253590272Z" level=error msg="Failed to destroy network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.253970 containerd[1558]: time="2025-02-13T19:22:41.253881395Z" level=error msg="encountered an error cleaning up failed sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.253970 containerd[1558]: time="2025-02-13T19:22:41.253917181Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup 
network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.254052 kubelet[2835]: E0213 19:22:41.254022 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:41.254094 kubelet[2835]: E0213 19:22:41.254051 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:41.254094 kubelet[2835]: E0213 19:22:41.254065 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:41.254139 kubelet[2835]: E0213 19:22:41.254090 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dfpnq" podUID="c028a42e-4dac-4c57-bfb3-5f11422685ac" Feb 13 19:22:41.291210 containerd[1558]: time="2025-02-13T19:22:41.291156738Z" level=info msg="CreateContainer within sandbox \"3de51ee35f84105c6cdf94418de2cf8c36975c0bfcf2b37647d4c3cb99f81b76\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98\"" Feb 13 19:22:41.291544 containerd[1558]: time="2025-02-13T19:22:41.291494125Z" level=info msg="StartContainer for \"5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98\"" Feb 13 19:22:41.423963 systemd[1]: Started cri-containerd-5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98.scope - libcontainer container 5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98. 
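[Editor's note] For scale: the pull reported at 19:22:41.143 above moved 142741872 bytes in 6.012483087s, which works out to roughly 142741872 / 6.012 ≈ 23.7 MB/s (≈ 22.6 MiB/s); treat that as an upper bound on network throughput, since the reported duration also covers unpacking. The cri-containerd-5306...cc98.scope unit systemd starts is the transient scope requested for the container's cgroup (the "libcontainer container <id>" description shows runc behind it), named cri-containerd-<full 64-hex container ID>.scope.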
Feb 13 19:22:41.451735 containerd[1558]: time="2025-02-13T19:22:41.451708801Z" level=info msg="StartContainer for \"5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98\" returns successfully" Feb 13 19:22:41.735388 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47-shm.mount: Deactivated successfully. Feb 13 19:22:41.735450 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513-shm.mount: Deactivated successfully. Feb 13 19:22:41.735490 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36-shm.mount: Deactivated successfully. Feb 13 19:22:41.750260 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 19:22:41.753898 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Feb 13 19:22:41.832994 kubelet[2835]: I0213 19:22:41.832946 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513" Feb 13 19:22:41.833758 containerd[1558]: time="2025-02-13T19:22:41.833739299Z" level=info msg="StopPodSandbox for \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\"" Feb 13 19:22:41.834571 kubelet[2835]: I0213 19:22:41.834530 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0" Feb 13 19:22:41.834788 containerd[1558]: time="2025-02-13T19:22:41.834768072Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" Feb 13 19:22:41.834941 containerd[1558]: time="2025-02-13T19:22:41.834925554Z" level=info msg="Ensure that sandbox 61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0 in task-service has been cleanup successfully" Feb 13 19:22:41.835494 containerd[1558]: time="2025-02-13T19:22:41.835108109Z" level=info msg="TearDown network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" successfully" Feb 13 19:22:41.835494 containerd[1558]: time="2025-02-13T19:22:41.835136143Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" returns successfully" Feb 13 19:22:41.835494 containerd[1558]: time="2025-02-13T19:22:41.835397243Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:22:41.835662 containerd[1558]: time="2025-02-13T19:22:41.835640974Z" level=info msg="TearDown network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" successfully" Feb 13 19:22:41.835662 containerd[1558]: time="2025-02-13T19:22:41.835659419Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" returns successfully" Feb 13 19:22:41.838064 containerd[1558]: time="2025-02-13T19:22:41.836009849Z" level=info msg="Ensure that sandbox f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513 in task-service has been cleanup successfully" Feb 13 19:22:41.837918 systemd[1]: run-netns-cni\x2d81da17a3\x2db9ea\x2db85d\x2d782a\x2d821905a32e99.mount: Deactivated successfully. Feb 13 19:22:41.837977 systemd[1]: run-netns-cni\x2dec071d00\x2d5901\x2d115a\x2d7c87\x2dd1664b6efb8d.mount: Deactivated successfully.
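[Editor's note] This is the turning point. With calico-node started ("StartContainer ... returns successfully"), its startup path can write the node's name into the hostPath-mounted /var/lib/calico/, the very file all the failing stat calls have been looking for; the kernel's wireguard lines most likely come from calico-node probing WireGuard support for its optional encryption feature. A minimal Go sketch of the unblocking write, assuming the conventional path from the error text; this is not Calico's actual source:

    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    // writeNodename records this node's name where the CNI plugin looks for
    // it; dir is the hostPath volume mounted into the calico/node container.
    func writeNodename(dir string) error {
    	name, err := os.Hostname()
    	if err != nil {
    		return err
    	}
    	if err := os.MkdirAll(dir, 0o755); err != nil {
    		return err
    	}
    	// After this file exists, the CNI plugin's nodename lookup succeeds
    	// and the RunPodSandbox retries above can start passing.
    	return os.WriteFile(filepath.Join(dir, "nodename"), []byte(name), 0o644)
    }

    func main() {
    	if err := writeNodename("/var/lib/calico"); err != nil {
    		fmt.Fprintln(os.Stderr, err)
    		os.Exit(1)
    	}
    }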
Feb 13 19:22:41.840823 containerd[1558]: time="2025-02-13T19:22:41.840105188Z" level=info msg="TearDown network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" successfully" Feb 13 19:22:41.840823 containerd[1558]: time="2025-02-13T19:22:41.840125636Z" level=info msg="StopPodSandbox for \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" returns successfully" Feb 13 19:22:41.840823 containerd[1558]: time="2025-02-13T19:22:41.840767193Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" Feb 13 19:22:41.841177 containerd[1558]: time="2025-02-13T19:22:41.841160096Z" level=info msg="TearDown network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" successfully" Feb 13 19:22:41.841315 containerd[1558]: time="2025-02-13T19:22:41.841217209Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" returns successfully" Feb 13 19:22:41.841505 containerd[1558]: time="2025-02-13T19:22:41.841491800Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:22:41.841566 containerd[1558]: time="2025-02-13T19:22:41.841554265Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:22:41.841566 containerd[1558]: time="2025-02-13T19:22:41.841563271Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:22:41.841854 containerd[1558]: time="2025-02-13T19:22:41.841800809Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:22:41.842137 containerd[1558]: time="2025-02-13T19:22:41.841885469Z" level=info msg="TearDown network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" successfully" Feb 13 19:22:41.842137 containerd[1558]: time="2025-02-13T19:22:41.841912920Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" returns successfully" Feb 13 19:22:41.842693 containerd[1558]: time="2025-02-13T19:22:41.842598594Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:22:41.842693 containerd[1558]: time="2025-02-13T19:22:41.842664393Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:22:41.842887 containerd[1558]: time="2025-02-13T19:22:41.842775272Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:22:41.843333 containerd[1558]: time="2025-02-13T19:22:41.843237462Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:22:41.843522 containerd[1558]: time="2025-02-13T19:22:41.843430029Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:22:41.843836 containerd[1558]: time="2025-02-13T19:22:41.843799935Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:22:41.843881 containerd[1558]: time="2025-02-13T19:22:41.843688816Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:4,}" Feb 13 19:22:41.845355 containerd[1558]: time="2025-02-13T19:22:41.844649479Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:22:41.845355 containerd[1558]: time="2025-02-13T19:22:41.844697510Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:22:41.845355 containerd[1558]: time="2025-02-13T19:22:41.844707329Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:22:41.845355 containerd[1558]: time="2025-02-13T19:22:41.845107178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:5,}" Feb 13 19:22:41.845479 kubelet[2835]: I0213 19:22:41.845017 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379" Feb 13 19:22:41.845666 containerd[1558]: time="2025-02-13T19:22:41.845655665Z" level=info msg="StopPodSandbox for \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\"" Feb 13 19:22:41.846358 containerd[1558]: time="2025-02-13T19:22:41.846345806Z" level=info msg="Ensure that sandbox acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379 in task-service has been cleanup successfully" Feb 13 19:22:41.846799 containerd[1558]: time="2025-02-13T19:22:41.846788710Z" level=info msg="TearDown network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" successfully" Feb 13 19:22:41.847050 containerd[1558]: time="2025-02-13T19:22:41.847036959Z" level=info msg="StopPodSandbox for \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" returns successfully" Feb 13 19:22:41.848444 systemd[1]: run-netns-cni\x2d36d2ac1d\x2d39af\x2de59c\x2de794\x2de81b16188c2a.mount: Deactivated successfully. 
Feb 13 19:22:41.851773 containerd[1558]: time="2025-02-13T19:22:41.851645166Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" Feb 13 19:22:41.852090 containerd[1558]: time="2025-02-13T19:22:41.852043148Z" level=info msg="TearDown network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" successfully" Feb 13 19:22:41.852090 containerd[1558]: time="2025-02-13T19:22:41.852088340Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" returns successfully" Feb 13 19:22:41.852407 containerd[1558]: time="2025-02-13T19:22:41.852392878Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:22:41.852831 containerd[1558]: time="2025-02-13T19:22:41.852451126Z" level=info msg="TearDown network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" successfully" Feb 13 19:22:41.852831 containerd[1558]: time="2025-02-13T19:22:41.852460555Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" returns successfully" Feb 13 19:22:41.853667 containerd[1558]: time="2025-02-13T19:22:41.853394687Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:22:41.853938 containerd[1558]: time="2025-02-13T19:22:41.853849379Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 19:22:41.854121 containerd[1558]: time="2025-02-13T19:22:41.854106937Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:22:41.855084 containerd[1558]: time="2025-02-13T19:22:41.854587432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:4,}" Feb 13 19:22:41.856044 kubelet[2835]: I0213 19:22:41.855946 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47" Feb 13 19:22:41.856547 containerd[1558]: time="2025-02-13T19:22:41.856511568Z" level=info msg="StopPodSandbox for \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\"" Feb 13 19:22:41.856632 containerd[1558]: time="2025-02-13T19:22:41.856620020Z" level=info msg="Ensure that sandbox 72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47 in task-service has been cleanup successfully" Feb 13 19:22:41.858117 systemd[1]: run-netns-cni\x2d899006d2\x2d296f\x2db988\x2d6cac\x2d83c7d236dc9d.mount: Deactivated successfully. 
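The run-netns-cni\x2d… unit names above are systemd's escaping at work: \x2d is an escaped "-", so each unit maps back to a /run/netns/cni-… bind mount left behind by a torn-down sandbox. A simplified Go decoder (it reverses the \xXX escapes only; systemd's full rules in systemd.unit(5) also turn "/" into "-", which this sketch does not undo):

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// unescapeUnit reverses systemd's \xXX byte escaping in unit names.
func unescapeUnit(s string) string {
	var b strings.Builder
	for i := 0; i < len(s); {
		if i+3 < len(s) && s[i] == '\\' && s[i+1] == 'x' {
			if v, err := strconv.ParseUint(s[i+2:i+4], 16, 8); err == nil {
				b.WriteByte(byte(v))
				i += 4
				continue
			}
		}
		b.WriteByte(s[i])
		i++
	}
	return b.String()
}

func main() {
	// Prints run-netns-cni-899006d2-296f-b988-6cac-83c7d236dc9d.mount,
	// i.e. the unit guarding /run/netns/cni-899006d2-....
	fmt.Println(unescapeUnit(`run-netns-cni\x2d899006d2\x2d296f\x2db988\x2d6cac\x2d83c7d236dc9d.mount`))
}
```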
Feb 13 19:22:41.859463 containerd[1558]: time="2025-02-13T19:22:41.858661526Z" level=info msg="TearDown network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" successfully" Feb 13 19:22:41.859463 containerd[1558]: time="2025-02-13T19:22:41.858691702Z" level=info msg="StopPodSandbox for \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" returns successfully" Feb 13 19:22:41.860790 containerd[1558]: time="2025-02-13T19:22:41.860524235Z" level=info msg="StopPodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" Feb 13 19:22:41.860790 containerd[1558]: time="2025-02-13T19:22:41.860574121Z" level=info msg="TearDown network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" successfully" Feb 13 19:22:41.860790 containerd[1558]: time="2025-02-13T19:22:41.860593182Z" level=info msg="StopPodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" returns successfully" Feb 13 19:22:41.862103 containerd[1558]: time="2025-02-13T19:22:41.862081369Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:22:41.862152 containerd[1558]: time="2025-02-13T19:22:41.862132116Z" level=info msg="TearDown network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" successfully" Feb 13 19:22:41.862152 containerd[1558]: time="2025-02-13T19:22:41.862138491Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" returns successfully" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.862971336Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863032534Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863040532Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863201518Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863356023Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863363657Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:22:41.864729 containerd[1558]: time="2025-02-13T19:22:41.863792176Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:5,}" Feb 13 19:22:41.865889 kubelet[2835]: I0213 19:22:41.865277 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36" Feb 13 19:22:41.866166 containerd[1558]: time="2025-02-13T19:22:41.866064086Z" level=info msg="StopPodSandbox for \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\"" Feb 13 19:22:41.869276 containerd[1558]: time="2025-02-13T19:22:41.869170728Z" level=info msg="Ensure that 
sandbox 0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36 in task-service has been cleanup successfully" Feb 13 19:22:41.869394 containerd[1558]: time="2025-02-13T19:22:41.869383225Z" level=info msg="TearDown network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" successfully" Feb 13 19:22:41.869740 containerd[1558]: time="2025-02-13T19:22:41.869516678Z" level=info msg="StopPodSandbox for \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" returns successfully" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870070288Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870130463Z" level=info msg="TearDown network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" successfully" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870168085Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" returns successfully" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870432803Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870569807Z" level=info msg="TearDown network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" successfully" Feb 13 19:22:41.870902 containerd[1558]: time="2025-02-13T19:22:41.870577521Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" returns successfully" Feb 13 19:22:41.872278 containerd[1558]: time="2025-02-13T19:22:41.871390328Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:22:41.872278 containerd[1558]: time="2025-02-13T19:22:41.871971338Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:22:41.872278 containerd[1558]: time="2025-02-13T19:22:41.871978981Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:22:41.872362 containerd[1558]: time="2025-02-13T19:22:41.872343295Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:22:41.872629 containerd[1558]: time="2025-02-13T19:22:41.872382045Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:22:41.872629 containerd[1558]: time="2025-02-13T19:22:41.872391254Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:22:41.872494 systemd[1]: run-netns-cni\x2d581e9483\x2d9869\x2d6595\x2db78c\x2d805aa3fd47ed.mount: Deactivated successfully. 
Feb 13 19:22:41.872718 kubelet[2835]: I0213 19:22:41.872447 2835 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271" Feb 13 19:22:41.873829 containerd[1558]: time="2025-02-13T19:22:41.873007312Z" level=info msg="StopPodSandbox for \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\"" Feb 13 19:22:41.873829 containerd[1558]: time="2025-02-13T19:22:41.873172633Z" level=info msg="Ensure that sandbox e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271 in task-service has been cleanup successfully" Feb 13 19:22:41.873829 containerd[1558]: time="2025-02-13T19:22:41.873332107Z" level=info msg="TearDown network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" successfully" Feb 13 19:22:41.873829 containerd[1558]: time="2025-02-13T19:22:41.873342975Z" level=info msg="StopPodSandbox for \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" returns successfully" Feb 13 19:22:41.873829 containerd[1558]: time="2025-02-13T19:22:41.873455317Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:5,}" Feb 13 19:22:41.875281 containerd[1558]: time="2025-02-13T19:22:41.874995952Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" Feb 13 19:22:41.875281 containerd[1558]: time="2025-02-13T19:22:41.875056565Z" level=info msg="TearDown network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" successfully" Feb 13 19:22:41.875281 containerd[1558]: time="2025-02-13T19:22:41.875065105Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" returns successfully" Feb 13 19:22:41.875829 containerd[1558]: time="2025-02-13T19:22:41.875752854Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:22:41.876007 containerd[1558]: time="2025-02-13T19:22:41.875992019Z" level=info msg="TearDown network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" successfully" Feb 13 19:22:41.876007 containerd[1558]: time="2025-02-13T19:22:41.876003237Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" returns successfully" Feb 13 19:22:41.876374 containerd[1558]: time="2025-02-13T19:22:41.876358732Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:22:41.876413 containerd[1558]: time="2025-02-13T19:22:41.876401747Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:22:41.876413 containerd[1558]: time="2025-02-13T19:22:41.876410540Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" returns successfully" Feb 13 19:22:41.877110 containerd[1558]: time="2025-02-13T19:22:41.876972595Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:22:41.877110 containerd[1558]: time="2025-02-13T19:22:41.877016515Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:22:41.877110 containerd[1558]: time="2025-02-13T19:22:41.877022437Z" 
level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:22:41.878187 containerd[1558]: time="2025-02-13T19:22:41.878037810Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:5,}" Feb 13 19:22:41.988494 kubelet[2835]: I0213 19:22:41.959894 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-ls26c" podStartSLOduration=1.5026643389999998 podStartE2EDuration="21.900584793s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:20.748964822 +0000 UTC m=+13.058165447" lastFinishedPulling="2025-02-13 19:22:41.146885273 +0000 UTC m=+33.456085901" observedRunningTime="2025-02-13 19:22:41.898671738 +0000 UTC m=+34.207872374" watchObservedRunningTime="2025-02-13 19:22:41.900584793 +0000 UTC m=+34.209785429" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.046 [INFO][4573] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.046 [INFO][4573] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" iface="eth0" netns="/var/run/netns/cni-1d6ab45f-86b9-54eb-a945-2c8cfc411111" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.047 [INFO][4573] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" iface="eth0" netns="/var/run/netns/cni-1d6ab45f-86b9-54eb-a945-2c8cfc411111" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.052 [INFO][4573] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" iface="eth0" netns="/var/run/netns/cni-1d6ab45f-86b9-54eb-a945-2c8cfc411111" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.052 [INFO][4573] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.052 [INFO][4573] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.379 [INFO][4619] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" HandleID="k8s-pod-network.71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.380 [INFO][4619] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.381 [INFO][4619] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.393 [WARNING][4619] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" HandleID="k8s-pod-network.71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.393 [INFO][4619] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" HandleID="k8s-pod-network.71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.395 [INFO][4619] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:22:42.401521 containerd[1558]: 2025-02-13 19:22:42.400 [INFO][4573] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c" Feb 13 19:22:42.403977 containerd[1558]: time="2025-02-13T19:22:42.403935209Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:42.404143 kubelet[2835]: E0213 19:22:42.404119 2835 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:22:42.404347 kubelet[2835]: E0213 19:22:42.404161 2835 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:42.404347 kubelet[2835]: E0213 19:22:42.404192 2835 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-dfpnq" Feb 13 19:22:42.404347 kubelet[2835]: E0213 19:22:42.404224 2835 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-dfpnq_kube-system(c028a42e-4dac-4c57-bfb3-5f11422685ac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"71c023f3d82a5f41da52b8ba9a2c412fd49b8b6a0132e9b038cecc17a978308c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-dfpnq" podUID="c028a42e-4dac-4c57-bfb3-5f11422685ac" Feb 13 19:22:42.534969 systemd-networkd[1459]: cali22f6eb1a933: Link UP Feb 13 19:22:42.535092 systemd-networkd[1459]: cali22f6eb1a933: Gained carrier Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:41.980 [INFO][4531] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.032 [INFO][4531] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0 calico-kube-controllers-b645f959d- calico-system 831dd8f9-aeed-441d-a3bb-5e45b59bb4ba 676 0 2025-02-13 19:22:20 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:b645f959d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-b645f959d-hxt68 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali22f6eb1a933 [] []}} ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.033 [INFO][4531] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.379 [INFO][4624] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" HandleID="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Workload="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.396 [INFO][4624] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" HandleID="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Workload="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000519d0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-b645f959d-hxt68", "timestamp":"2025-02-13 19:22:42.379352756 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4624] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4624] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4624] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.400 [INFO][4624] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.494 [INFO][4624] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.496 [INFO][4624] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.497 [INFO][4624] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.498 [INFO][4624] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.498 [INFO][4624] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.498 [INFO][4624] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.504 [INFO][4624] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4624] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4624] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" host="localhost" Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4624] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
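The IPAM sequence above is block affinity in action: the host holds the block 192.168.88.128/26, and the first workload address claimed from it is 192.168.88.129. The arithmetic, as a small Go sketch using net/netip:

```go
package main

import (
	"fmt"
	"net/netip"
)

func main() {
	block := netip.MustParsePrefix("192.168.88.128/26")
	size := 1 << (32 - block.Bits()) // a /26 spans 64 addresses
	first := block.Addr().Next()     // .129: first address after the block base
	fmt.Printf("block %s holds %d addresses; first assignment %s\n",
		block, size, first)
}
```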
Feb 13 19:22:42.542297 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4624] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" HandleID="k8s-pod-network.23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Workload="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.517 [INFO][4531] cni-plugin/k8s.go 386: Populated endpoint ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0", GenerateName:"calico-kube-controllers-b645f959d-", Namespace:"calico-system", SelfLink:"", UID:"831dd8f9-aeed-441d-a3bb-5e45b59bb4ba", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b645f959d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-b645f959d-hxt68", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali22f6eb1a933", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.517 [INFO][4531] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.517 [INFO][4531] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali22f6eb1a933 ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.533 [INFO][4531] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.533 [INFO][4531] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to 
endpoint ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0", GenerateName:"calico-kube-controllers-b645f959d-", Namespace:"calico-system", SelfLink:"", UID:"831dd8f9-aeed-441d-a3bb-5e45b59bb4ba", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b645f959d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c", Pod:"calico-kube-controllers-b645f959d-hxt68", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali22f6eb1a933", MAC:"ae:b2:d0:7f:14:98", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.544149 containerd[1558]: 2025-02-13 19:22:42.539 [INFO][4531] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c" Namespace="calico-system" Pod="calico-kube-controllers-b645f959d-hxt68" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--b645f959d--hxt68-eth0" Feb 13 19:22:42.578292 containerd[1558]: time="2025-02-13T19:22:42.575742772Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:42.578292 containerd[1558]: time="2025-02-13T19:22:42.578223903Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:42.578292 containerd[1558]: time="2025-02-13T19:22:42.578232475Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.578493 containerd[1558]: time="2025-02-13T19:22:42.578358362Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.590935 systemd[1]: Started cri-containerd-23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c.scope - libcontainer container 23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c. 
Feb 13 19:22:42.605223 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:42.621197 systemd-networkd[1459]: calia70eac9f65d: Link UP Feb 13 19:22:42.623684 systemd-networkd[1459]: calia70eac9f65d: Gained carrier Feb 13 19:22:42.646337 containerd[1558]: time="2025-02-13T19:22:42.646309119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b645f959d-hxt68,Uid:831dd8f9-aeed-441d-a3bb-5e45b59bb4ba,Namespace:calico-system,Attempt:5,} returns sandbox id \"23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c\"" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:41.964 [INFO][4577] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.033 [INFO][4577] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0 calico-apiserver-64b59d7dcb- calico-apiserver cda0b9c0-3854-44b6-bafc-261c79251f6a 681 0 2025-02-13 19:22:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64b59d7dcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64b59d7dcb-r4fxw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia70eac9f65d [] []}} ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.033 [INFO][4577] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.380 [INFO][4623] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" HandleID="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4623] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" HandleID="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003a1220), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64b59d7dcb-r4fxw", "timestamp":"2025-02-13 19:22:42.380023503 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4623] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4623] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.514 [INFO][4623] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.516 [INFO][4623] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.595 [INFO][4623] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.599 [INFO][4623] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.600 [INFO][4623] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.602 [INFO][4623] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.602 [INFO][4623] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.603 [INFO][4623] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1 Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.610 [INFO][4623] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.615 [INFO][4623] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.615 [INFO][4623] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" host="localhost" Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.615 [INFO][4623] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:22:42.654091 containerd[1558]: 2025-02-13 19:22:42.615 [INFO][4623] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" HandleID="k8s-pod-network.e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.617 [INFO][4577] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0", GenerateName:"calico-apiserver-64b59d7dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"cda0b9c0-3854-44b6-bafc-261c79251f6a", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64b59d7dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64b59d7dcb-r4fxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia70eac9f65d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.617 [INFO][4577] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.617 [INFO][4577] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia70eac9f65d ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.625 [INFO][4577] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.625 [INFO][4577] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0", GenerateName:"calico-apiserver-64b59d7dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"cda0b9c0-3854-44b6-bafc-261c79251f6a", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64b59d7dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1", Pod:"calico-apiserver-64b59d7dcb-r4fxw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia70eac9f65d", MAC:"de:cd:3d:69:38:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.654539 containerd[1558]: 2025-02-13 19:22:42.650 [INFO][4577] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-r4fxw" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--r4fxw-eth0" Feb 13 19:22:42.659867 containerd[1558]: time="2025-02-13T19:22:42.659408231Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Feb 13 19:22:42.683944 containerd[1558]: time="2025-02-13T19:22:42.683884253Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:42.683944 containerd[1558]: time="2025-02-13T19:22:42.683918849Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:42.683944 containerd[1558]: time="2025-02-13T19:22:42.683932645Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.685893 containerd[1558]: time="2025-02-13T19:22:42.683984920Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.708182 systemd[1]: Started cri-containerd-e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1.scope - libcontainer container e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1. 
Feb 13 19:22:42.728444 systemd-networkd[1459]: cali44d769c1abc: Link UP Feb 13 19:22:42.729873 systemd-networkd[1459]: cali44d769c1abc: Gained carrier Feb 13 19:22:42.740666 systemd[1]: run-netns-cni\x2d2c4d0ed5\x2da4cc\x2d3010\x2d3729\x2d9a03f1bccdb0.mount: Deactivated successfully. Feb 13 19:22:42.750718 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:41.976 [INFO][4546] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.033 [INFO][4546] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--wvctv-eth0 coredns-668d6bf9bc- kube-system 6f8fe6ce-d949-403c-a86a-82a4773819d5 680 0 2025-02-13 19:22:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-wvctv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali44d769c1abc [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.033 [INFO][4546] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.379 [INFO][4621] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" HandleID="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Workload="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4621] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" HandleID="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Workload="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002aaf10), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-wvctv", "timestamp":"2025-02-13 19:22:42.379070106 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.397 [INFO][4621] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.616 [INFO][4621] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.618 [INFO][4621] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.621 [INFO][4621] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.704 [INFO][4621] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.708 [INFO][4621] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.711 [INFO][4621] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.713 [INFO][4621] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.713 [INFO][4621] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.716 [INFO][4621] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191 Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.719 [INFO][4621] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.722 [INFO][4621] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.722 [INFO][4621] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" host="localhost" Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.722 [INFO][4621] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:22:42.756223 containerd[1558]: 2025-02-13 19:22:42.723 [INFO][4621] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" HandleID="k8s-pod-network.e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Workload="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.725 [INFO][4546] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--wvctv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6f8fe6ce-d949-403c-a86a-82a4773819d5", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-wvctv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44d769c1abc", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.725 [INFO][4546] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.725 [INFO][4546] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali44d769c1abc ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.732 [INFO][4546] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.739 
[INFO][4546] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--wvctv-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6f8fe6ce-d949-403c-a86a-82a4773819d5", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191", Pod:"coredns-668d6bf9bc-wvctv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali44d769c1abc", MAC:"ca:0b:f7:d4:91:ac", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.756674 containerd[1558]: 2025-02-13 19:22:42.754 [INFO][4546] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191" Namespace="kube-system" Pod="coredns-668d6bf9bc-wvctv" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--wvctv-eth0" Feb 13 19:22:42.778067 containerd[1558]: time="2025-02-13T19:22:42.777952091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-r4fxw,Uid:cda0b9c0-3854-44b6-bafc-261c79251f6a,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1\"" Feb 13 19:22:42.781548 containerd[1558]: time="2025-02-13T19:22:42.781493058Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:42.781781 containerd[1558]: time="2025-02-13T19:22:42.781638366Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:42.781912 containerd[1558]: time="2025-02-13T19:22:42.781868894Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.782355 containerd[1558]: time="2025-02-13T19:22:42.782337379Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.796922 systemd[1]: Started cri-containerd-e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191.scope - libcontainer container e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191. Feb 13 19:22:42.806702 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:42.824150 systemd-networkd[1459]: calid438bcf2ab0: Link UP Feb 13 19:22:42.824284 systemd-networkd[1459]: calid438bcf2ab0: Gained carrier Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:41.987 [INFO][4564] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.034 [INFO][4564] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0 calico-apiserver-64b59d7dcb- calico-apiserver 40b4dc54-80eb-4f55-8dea-43e4da051ec7 679 0 2025-02-13 19:22:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64b59d7dcb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64b59d7dcb-8vjgc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid438bcf2ab0 [] []}} ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.034 [INFO][4564] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.379 [INFO][4622] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" HandleID="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.399 [INFO][4622] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" HandleID="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00041a7a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64b59d7dcb-8vjgc", "timestamp":"2025-02-13 19:22:42.379604615 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 
19:22:42.400 [INFO][4622] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.722 [INFO][4622] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.722 [INFO][4622] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.725 [INFO][4622] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.800 [INFO][4622] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.809 [INFO][4622] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.810 [INFO][4622] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.811 [INFO][4622] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.811 [INFO][4622] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.812 [INFO][4622] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7 Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.815 [INFO][4622] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.818 [INFO][4622] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.819 [INFO][4622] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" host="localhost" Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.819 [INFO][4622] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:22:42.836424 containerd[1558]: 2025-02-13 19:22:42.819 [INFO][4622] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" HandleID="k8s-pod-network.0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Workload="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.821 [INFO][4564] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0", GenerateName:"calico-apiserver-64b59d7dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"40b4dc54-80eb-4f55-8dea-43e4da051ec7", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64b59d7dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64b59d7dcb-8vjgc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid438bcf2ab0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.821 [INFO][4564] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.821 [INFO][4564] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid438bcf2ab0 ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.823 [INFO][4564] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.823 [INFO][4564] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0", GenerateName:"calico-apiserver-64b59d7dcb-", Namespace:"calico-apiserver", SelfLink:"", UID:"40b4dc54-80eb-4f55-8dea-43e4da051ec7", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64b59d7dcb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7", Pod:"calico-apiserver-64b59d7dcb-8vjgc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid438bcf2ab0", MAC:"ce:f4:6f:45:78:af", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.836947 containerd[1558]: 2025-02-13 19:22:42.834 [INFO][4564] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7" Namespace="calico-apiserver" Pod="calico-apiserver-64b59d7dcb-8vjgc" WorkloadEndpoint="localhost-k8s-calico--apiserver--64b59d7dcb--8vjgc-eth0" Feb 13 19:22:42.840147 containerd[1558]: time="2025-02-13T19:22:42.839657952Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-wvctv,Uid:6f8fe6ce-d949-403c-a86a-82a4773819d5,Namespace:kube-system,Attempt:4,} returns sandbox id \"e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191\"" Feb 13 19:22:42.842411 containerd[1558]: time="2025-02-13T19:22:42.842391293Z" level=info msg="CreateContainer within sandbox \"e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 19:22:42.852029 containerd[1558]: time="2025-02-13T19:22:42.851944054Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:42.852132 containerd[1558]: time="2025-02-13T19:22:42.852010118Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:42.852159 containerd[1558]: time="2025-02-13T19:22:42.852120158Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.852256 containerd[1558]: time="2025-02-13T19:22:42.852226859Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.860793 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1313904090.mount: Deactivated successfully. Feb 13 19:22:42.863576 containerd[1558]: time="2025-02-13T19:22:42.863552032Z" level=info msg="CreateContainer within sandbox \"e567e4f58e137c0ff0e8fd40cc09a412b180c81f083cf10bab62f796d5816191\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"daca88483921390b8c77af5b91a76772e92ecd37dda16f8cb1fd411ad58f9827\"" Feb 13 19:22:42.865088 containerd[1558]: time="2025-02-13T19:22:42.865015576Z" level=info msg="StartContainer for \"daca88483921390b8c77af5b91a76772e92ecd37dda16f8cb1fd411ad58f9827\"" Feb 13 19:22:42.873978 systemd[1]: Started cri-containerd-0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7.scope - libcontainer container 0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7. Feb 13 19:22:42.887403 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:42.888921 systemd[1]: Started cri-containerd-daca88483921390b8c77af5b91a76772e92ecd37dda16f8cb1fd411ad58f9827.scope - libcontainer container daca88483921390b8c77af5b91a76772e92ecd37dda16f8cb1fd411ad58f9827. Feb 13 19:22:42.890849 containerd[1558]: time="2025-02-13T19:22:42.890691751Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" Feb 13 19:22:42.890849 containerd[1558]: time="2025-02-13T19:22:42.890796274Z" level=info msg="TearDown network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" successfully" Feb 13 19:22:42.890849 containerd[1558]: time="2025-02-13T19:22:42.890804405Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" returns successfully" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.891093280Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.891338072Z" level=info msg="TearDown network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" successfully" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.891346747Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" returns successfully" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.891632308Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.892201755Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:22:42.892301 containerd[1558]: time="2025-02-13T19:22:42.892211437Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:22:42.893578 containerd[1558]: time="2025-02-13T19:22:42.892539493Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:22:42.893578 containerd[1558]: time="2025-02-13T19:22:42.892578357Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:22:42.893578 containerd[1558]: time="2025-02-13T19:22:42.892584109Z" 
level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:22:42.893578 containerd[1558]: time="2025-02-13T19:22:42.893258282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:4,}" Feb 13 19:22:42.925717 containerd[1558]: time="2025-02-13T19:22:42.925652036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64b59d7dcb-8vjgc,Uid:40b4dc54-80eb-4f55-8dea-43e4da051ec7,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7\"" Feb 13 19:22:42.947150 containerd[1558]: time="2025-02-13T19:22:42.947122656Z" level=info msg="StartContainer for \"daca88483921390b8c77af5b91a76772e92ecd37dda16f8cb1fd411ad58f9827\" returns successfully" Feb 13 19:22:42.960280 systemd-networkd[1459]: cali1e1b6a760a5: Link UP Feb 13 19:22:42.962921 systemd-networkd[1459]: cali1e1b6a760a5: Gained carrier Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.007 [INFO][4592] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.034 [INFO][4592] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--8xksv-eth0 csi-node-driver- calico-system d4951e2a-1949-44bc-afb1-457c1decf801 594 0 2025-02-13 19:22:20 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:84cddb44f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-8xksv eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1e1b6a760a5 [] []}} ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.034 [INFO][4592] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.379 [INFO][4620] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" HandleID="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Workload="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.400 [INFO][4620] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" HandleID="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Workload="localhost-k8s-csi--node--driver--8xksv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba220), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-8xksv", "timestamp":"2025-02-13 19:22:42.379930315 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.400 [INFO][4620] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.819 [INFO][4620] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.819 [INFO][4620] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.827 [INFO][4620] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.901 [INFO][4620] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.911 [INFO][4620] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.913 [INFO][4620] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.917 [INFO][4620] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.917 [INFO][4620] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.933 [INFO][4620] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8 Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.943 [INFO][4620] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.951 [INFO][4620] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.951 [INFO][4620] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" host="localhost" Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.951 [INFO][4620] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:22:42.976713 containerd[1558]: 2025-02-13 19:22:42.951 [INFO][4620] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" HandleID="k8s-pod-network.7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Workload="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.954 [INFO][4592] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8xksv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d4951e2a-1949-44bc-afb1-457c1decf801", ResourceVersion:"594", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-8xksv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1e1b6a760a5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.954 [INFO][4592] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.954 [INFO][4592] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1e1b6a760a5 ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.964 [INFO][4592] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.964 [INFO][4592] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--8xksv-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"d4951e2a-1949-44bc-afb1-457c1decf801", ResourceVersion:"594", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8", Pod:"csi-node-driver-8xksv", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1e1b6a760a5", MAC:"b2:d0:f5:2e:22:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:42.977856 containerd[1558]: 2025-02-13 19:22:42.975 [INFO][4592] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8" Namespace="calico-system" Pod="csi-node-driver-8xksv" WorkloadEndpoint="localhost-k8s-csi--node--driver--8xksv-eth0" Feb 13 19:22:42.992709 containerd[1558]: time="2025-02-13T19:22:42.992623949Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:42.992780 containerd[1558]: time="2025-02-13T19:22:42.992723041Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:42.992780 containerd[1558]: time="2025-02-13T19:22:42.992741986Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:42.993189 containerd[1558]: time="2025-02-13T19:22:42.993121707Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:43.005956 systemd[1]: Started cri-containerd-7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8.scope - libcontainer container 7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8. 
Feb 13 19:22:43.019829 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:43.033532 systemd-networkd[1459]: cali3abe538f69b: Link UP Feb 13 19:22:43.034216 systemd-networkd[1459]: cali3abe538f69b: Gained carrier Feb 13 19:22:43.037140 containerd[1558]: time="2025-02-13T19:22:43.037109216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8xksv,Uid:d4951e2a-1949-44bc-afb1-457c1decf801,Namespace:calico-system,Attempt:5,} returns sandbox id \"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8\"" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:42.929 [INFO][4897] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:42.943 [INFO][4897] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0 coredns-668d6bf9bc- kube-system c028a42e-4dac-4c57-bfb3-5f11422685ac 767 0 2025-02-13 19:22:15 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-dfpnq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3abe538f69b [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:42.946 [INFO][4897] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:42.983 [INFO][4923] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" HandleID="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.004 [INFO][4923] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" HandleID="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002907f0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-dfpnq", "timestamp":"2025-02-13 19:22:42.982963988 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.004 [INFO][4923] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.004 [INFO][4923] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.004 [INFO][4923] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.005 [INFO][4923] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.008 [INFO][4923] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.012 [INFO][4923] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.013 [INFO][4923] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.015 [INFO][4923] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.015 [INFO][4923] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.016 [INFO][4923] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76 Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.019 [INFO][4923] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.025 [INFO][4923] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.025 [INFO][4923] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" host="localhost" Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.025 [INFO][4923] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:22:43.047561 containerd[1558]: 2025-02-13 19:22:43.025 [INFO][4923] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" HandleID="k8s-pod-network.8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Workload="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.030 [INFO][4897] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c028a42e-4dac-4c57-bfb3-5f11422685ac", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-dfpnq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3abe538f69b", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.030 [INFO][4897] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.030 [INFO][4897] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3abe538f69b ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.034 [INFO][4897] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.034 
[INFO][4897] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"c028a42e-4dac-4c57-bfb3-5f11422685ac", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 22, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76", Pod:"coredns-668d6bf9bc-dfpnq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3abe538f69b", MAC:"aa:2d:7f:38:f1:98", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:22:43.048293 containerd[1558]: 2025-02-13 19:22:43.045 [INFO][4897] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76" Namespace="kube-system" Pod="coredns-668d6bf9bc-dfpnq" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--dfpnq-eth0" Feb 13 19:22:43.065176 containerd[1558]: time="2025-02-13T19:22:43.065115067Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:22:43.065354 containerd[1558]: time="2025-02-13T19:22:43.065167451Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:22:43.065354 containerd[1558]: time="2025-02-13T19:22:43.065185984Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:43.065410 containerd[1558]: time="2025-02-13T19:22:43.065254553Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:22:43.078951 systemd[1]: Started cri-containerd-8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76.scope - libcontainer container 8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76. 
Feb 13 19:22:43.087284 systemd-resolved[1460]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:22:43.110407 containerd[1558]: time="2025-02-13T19:22:43.110033235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-dfpnq,Uid:c028a42e-4dac-4c57-bfb3-5f11422685ac,Namespace:kube-system,Attempt:4,} returns sandbox id \"8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76\"" Feb 13 19:22:43.114376 containerd[1558]: time="2025-02-13T19:22:43.114335932Z" level=info msg="CreateContainer within sandbox \"8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 19:22:43.126916 containerd[1558]: time="2025-02-13T19:22:43.126801853Z" level=info msg="CreateContainer within sandbox \"8449414c73d17e7760762e12fec223d7f52dc3008faf6a5e898657d3d69ddf76\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bdd25a879eba6e21bad120de98a7ca76a9c532bd1de959054865b24633762d45\"" Feb 13 19:22:43.127538 containerd[1558]: time="2025-02-13T19:22:43.127503947Z" level=info msg="StartContainer for \"bdd25a879eba6e21bad120de98a7ca76a9c532bd1de959054865b24633762d45\"" Feb 13 19:22:43.152979 systemd[1]: Started cri-containerd-bdd25a879eba6e21bad120de98a7ca76a9c532bd1de959054865b24633762d45.scope - libcontainer container bdd25a879eba6e21bad120de98a7ca76a9c532bd1de959054865b24633762d45. Feb 13 19:22:43.172988 containerd[1558]: time="2025-02-13T19:22:43.172953691Z" level=info msg="StartContainer for \"bdd25a879eba6e21bad120de98a7ca76a9c532bd1de959054865b24633762d45\" returns successfully" Feb 13 19:22:43.440831 kernel: bpftool[5178]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 19:22:43.606675 systemd-networkd[1459]: vxlan.calico: Link UP Feb 13 19:22:43.606681 systemd-networkd[1459]: vxlan.calico: Gained carrier Feb 13 19:22:43.912116 kubelet[2835]: I0213 19:22:43.911717 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-wvctv" podStartSLOduration=28.911703888 podStartE2EDuration="28.911703888s" podCreationTimestamp="2025-02-13 19:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:43.910969734 +0000 UTC m=+36.220170364" watchObservedRunningTime="2025-02-13 19:22:43.911703888 +0000 UTC m=+36.220904525" Feb 13 19:22:43.912116 kubelet[2835]: I0213 19:22:43.911782 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-dfpnq" podStartSLOduration=28.911778237 podStartE2EDuration="28.911778237s" podCreationTimestamp="2025-02-13 19:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:22:43.904967687 +0000 UTC m=+36.214168316" watchObservedRunningTime="2025-02-13 19:22:43.911778237 +0000 UTC m=+36.220978866" Feb 13 19:22:44.056917 systemd-networkd[1459]: cali44d769c1abc: Gained IPv6LL Feb 13 19:22:44.248984 systemd-networkd[1459]: calia70eac9f65d: Gained IPv6LL Feb 13 19:22:44.314027 systemd-networkd[1459]: cali22f6eb1a933: Gained IPv6LL Feb 13 19:22:44.696949 systemd-networkd[1459]: cali3abe538f69b: Gained IPv6LL Feb 13 19:22:44.881841 containerd[1558]: time="2025-02-13T19:22:44.881529818Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:44.882085 containerd[1558]: time="2025-02-13T19:22:44.881920234Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Feb 13 19:22:44.882304 containerd[1558]: time="2025-02-13T19:22:44.882291186Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:44.883524 containerd[1558]: time="2025-02-13T19:22:44.883511162Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:44.883998 containerd[1558]: time="2025-02-13T19:22:44.883986091Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.224553593s" Feb 13 19:22:44.884058 containerd[1558]: time="2025-02-13T19:22:44.884049791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Feb 13 19:22:44.885165 containerd[1558]: time="2025-02-13T19:22:44.885145301Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 19:22:44.889025 systemd-networkd[1459]: calid438bcf2ab0: Gained IPv6LL Feb 13 19:22:44.895979 containerd[1558]: time="2025-02-13T19:22:44.895687861Z" level=info msg="CreateContainer within sandbox \"23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Feb 13 19:22:44.903237 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1633802994.mount: Deactivated successfully. Feb 13 19:22:44.905700 containerd[1558]: time="2025-02-13T19:22:44.905680837Z" level=info msg="CreateContainer within sandbox \"23b706424006a1a79ed1607c5252f5fb7bf63780bd1a7e8ec6786ee5e2b97a0c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9681ea26c408fdfe7f64a06764a364c2409143a12d830625e6f68d228cdfca70\"" Feb 13 19:22:44.906171 containerd[1558]: time="2025-02-13T19:22:44.906146815Z" level=info msg="StartContainer for \"9681ea26c408fdfe7f64a06764a364c2409143a12d830625e6f68d228cdfca70\"" Feb 13 19:22:44.933901 systemd[1]: Started cri-containerd-9681ea26c408fdfe7f64a06764a364c2409143a12d830625e6f68d228cdfca70.scope - libcontainer container 9681ea26c408fdfe7f64a06764a364c2409143a12d830625e6f68d228cdfca70. 
Feb 13 19:22:44.952933 systemd-networkd[1459]: cali1e1b6a760a5: Gained IPv6LL Feb 13 19:22:44.965244 containerd[1558]: time="2025-02-13T19:22:44.964914845Z" level=info msg="StartContainer for \"9681ea26c408fdfe7f64a06764a364c2409143a12d830625e6f68d228cdfca70\" returns successfully" Feb 13 19:22:45.081058 systemd-networkd[1459]: vxlan.calico: Gained IPv6LL Feb 13 19:22:45.928529 kubelet[2835]: I0213 19:22:45.928190 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-b645f959d-hxt68" podStartSLOduration=23.702501171 podStartE2EDuration="25.928176731s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:42.659031118 +0000 UTC m=+34.968231749" lastFinishedPulling="2025-02-13 19:22:44.884706676 +0000 UTC m=+37.193907309" observedRunningTime="2025-02-13 19:22:45.928027706 +0000 UTC m=+38.237228341" watchObservedRunningTime="2025-02-13 19:22:45.928176731 +0000 UTC m=+38.237377362" Feb 13 19:22:46.925226 kubelet[2835]: I0213 19:22:46.924757 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:22:47.363246 kubelet[2835]: I0213 19:22:47.362969 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:22:47.722775 systemd[1]: run-containerd-runc-k8s.io-5306a1bb9aecf6f51c15945f39d2e22bd63ed8ccfd5086dd475a4963a876cc98-runc.LNtYwI.mount: Deactivated successfully. Feb 13 19:22:47.831342 containerd[1558]: time="2025-02-13T19:22:47.831312893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:47.832723 containerd[1558]: time="2025-02-13T19:22:47.832698668Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Feb 13 19:22:47.833950 containerd[1558]: time="2025-02-13T19:22:47.833540564Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:47.835108 containerd[1558]: time="2025-02-13T19:22:47.835094446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:47.836129 containerd[1558]: time="2025-02-13T19:22:47.836115817Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.95095209s" Feb 13 19:22:47.836192 containerd[1558]: time="2025-02-13T19:22:47.836182613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 19:22:47.837838 containerd[1558]: time="2025-02-13T19:22:47.837754495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 19:22:47.838192 containerd[1558]: time="2025-02-13T19:22:47.838178950Z" level=info msg="CreateContainer within sandbox \"e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 19:22:47.859322 
containerd[1558]: time="2025-02-13T19:22:47.859247642Z" level=info msg="CreateContainer within sandbox \"e53f65997cf88c7e00c4e711fd9107416a08ca1d9737f89cb88ef7ca5cd6a9c1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e2993958cc802cd96b82add456972ce22008afc917c79f02c6b5f4f8afaca05a\"" Feb 13 19:22:47.860079 containerd[1558]: time="2025-02-13T19:22:47.859898636Z" level=info msg="StartContainer for \"e2993958cc802cd96b82add456972ce22008afc917c79f02c6b5f4f8afaca05a\"" Feb 13 19:22:47.894007 systemd[1]: Started cri-containerd-e2993958cc802cd96b82add456972ce22008afc917c79f02c6b5f4f8afaca05a.scope - libcontainer container e2993958cc802cd96b82add456972ce22008afc917c79f02c6b5f4f8afaca05a. Feb 13 19:22:47.932170 containerd[1558]: time="2025-02-13T19:22:47.932141370Z" level=info msg="StartContainer for \"e2993958cc802cd96b82add456972ce22008afc917c79f02c6b5f4f8afaca05a\" returns successfully" Feb 13 19:22:48.346551 kubelet[2835]: I0213 19:22:48.346319 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:22:48.376353 containerd[1558]: time="2025-02-13T19:22:48.376330032Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:48.377381 containerd[1558]: time="2025-02-13T19:22:48.377360141Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Feb 13 19:22:48.378626 containerd[1558]: time="2025-02-13T19:22:48.378451042Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 540.661454ms" Feb 13 19:22:48.378626 containerd[1558]: time="2025-02-13T19:22:48.378476242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 19:22:48.379697 containerd[1558]: time="2025-02-13T19:22:48.379654793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 19:22:48.383884 containerd[1558]: time="2025-02-13T19:22:48.383706491Z" level=info msg="CreateContainer within sandbox \"0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 19:22:48.394118 containerd[1558]: time="2025-02-13T19:22:48.394092180Z" level=info msg="CreateContainer within sandbox \"0cd69f255e9654e8ef22fed3e713667a117f71930e555c0e1c0182b780197fe7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"570355a51fa0b7db2e8c4614eeeded7fca3d79817c3c100a1ad69b43da9fdd48\"" Feb 13 19:22:48.394943 containerd[1558]: time="2025-02-13T19:22:48.394743474Z" level=info msg="StartContainer for \"570355a51fa0b7db2e8c4614eeeded7fca3d79817c3c100a1ad69b43da9fdd48\"" Feb 13 19:22:48.432025 systemd[1]: Started cri-containerd-570355a51fa0b7db2e8c4614eeeded7fca3d79817c3c100a1ad69b43da9fdd48.scope - libcontainer container 570355a51fa0b7db2e8c4614eeeded7fca3d79817c3c100a1ad69b43da9fdd48. 
Feb 13 19:22:48.500777 containerd[1558]: time="2025-02-13T19:22:48.500701470Z" level=info msg="StartContainer for \"570355a51fa0b7db2e8c4614eeeded7fca3d79817c3c100a1ad69b43da9fdd48\" returns successfully" Feb 13 19:22:48.959232 kubelet[2835]: I0213 19:22:48.959197 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64b59d7dcb-r4fxw" podStartSLOduration=23.901513766 podStartE2EDuration="28.959185175s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:42.779230998 +0000 UTC m=+35.088431623" lastFinishedPulling="2025-02-13 19:22:47.836902403 +0000 UTC m=+40.146103032" observedRunningTime="2025-02-13 19:22:48.949062394 +0000 UTC m=+41.258263030" watchObservedRunningTime="2025-02-13 19:22:48.959185175 +0000 UTC m=+41.268385807" Feb 13 19:22:48.959522 kubelet[2835]: I0213 19:22:48.959299 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64b59d7dcb-8vjgc" podStartSLOduration=23.507732185 podStartE2EDuration="28.959295214s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:42.927657232 +0000 UTC m=+35.236857859" lastFinishedPulling="2025-02-13 19:22:48.379220258 +0000 UTC m=+40.688420888" observedRunningTime="2025-02-13 19:22:48.959176194 +0000 UTC m=+41.268376821" watchObservedRunningTime="2025-02-13 19:22:48.959295214 +0000 UTC m=+41.268495844" Feb 13 19:22:49.945775 kubelet[2835]: I0213 19:22:49.945680 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:22:49.945775 kubelet[2835]: I0213 19:22:49.945712 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:22:50.610619 containerd[1558]: time="2025-02-13T19:22:50.610191371Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:50.612585 containerd[1558]: time="2025-02-13T19:22:50.611176165Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 19:22:50.612585 containerd[1558]: time="2025-02-13T19:22:50.611853134Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:50.614405 containerd[1558]: time="2025-02-13T19:22:50.614165558Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:50.616583 containerd[1558]: time="2025-02-13T19:22:50.615757885Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 2.236085028s" Feb 13 19:22:50.616583 containerd[1558]: time="2025-02-13T19:22:50.615781850Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 19:22:50.621693 containerd[1558]: time="2025-02-13T19:22:50.621628493Z" level=info msg="CreateContainer within sandbox 
\"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 19:22:50.649360 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3561147073.mount: Deactivated successfully. Feb 13 19:22:50.657867 containerd[1558]: time="2025-02-13T19:22:50.657832967Z" level=info msg="CreateContainer within sandbox \"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5334ee62b0753bd4b32046d1a94a10fee9aebf234dba10c03d0f8f2d3252a45c\"" Feb 13 19:22:50.658731 containerd[1558]: time="2025-02-13T19:22:50.658658574Z" level=info msg="StartContainer for \"5334ee62b0753bd4b32046d1a94a10fee9aebf234dba10c03d0f8f2d3252a45c\"" Feb 13 19:22:50.696973 systemd[1]: Started cri-containerd-5334ee62b0753bd4b32046d1a94a10fee9aebf234dba10c03d0f8f2d3252a45c.scope - libcontainer container 5334ee62b0753bd4b32046d1a94a10fee9aebf234dba10c03d0f8f2d3252a45c. Feb 13 19:22:50.724894 containerd[1558]: time="2025-02-13T19:22:50.724858976Z" level=info msg="StartContainer for \"5334ee62b0753bd4b32046d1a94a10fee9aebf234dba10c03d0f8f2d3252a45c\" returns successfully" Feb 13 19:22:50.726145 containerd[1558]: time="2025-02-13T19:22:50.726121678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 19:22:53.122064 containerd[1558]: time="2025-02-13T19:22:53.122031395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:53.132048 containerd[1558]: time="2025-02-13T19:22:53.131989941Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 19:22:53.146583 containerd[1558]: time="2025-02-13T19:22:53.146545117Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:53.164573 containerd[1558]: time="2025-02-13T19:22:53.164536395Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:22:53.169415 containerd[1558]: time="2025-02-13T19:22:53.165119157Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.438974869s" Feb 13 19:22:53.169415 containerd[1558]: time="2025-02-13T19:22:53.165223609Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 19:22:53.169415 containerd[1558]: time="2025-02-13T19:22:53.167180525Z" level=info msg="CreateContainer within sandbox \"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 19:22:53.277704 containerd[1558]: time="2025-02-13T19:22:53.277622198Z" level=info msg="CreateContainer within sandbox 
\"7941906e39b5c6d0f440c23903e4fa79d12c98a5f2b0048dc9a21f40f55c56f8\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"47bbbb78549345128e2364b906473ea5ce4f1460686c8c4bf19b387c48feb711\"" Feb 13 19:22:53.278751 containerd[1558]: time="2025-02-13T19:22:53.278120673Z" level=info msg="StartContainer for \"47bbbb78549345128e2364b906473ea5ce4f1460686c8c4bf19b387c48feb711\"" Feb 13 19:22:53.302949 systemd[1]: Started cri-containerd-47bbbb78549345128e2364b906473ea5ce4f1460686c8c4bf19b387c48feb711.scope - libcontainer container 47bbbb78549345128e2364b906473ea5ce4f1460686c8c4bf19b387c48feb711. Feb 13 19:22:53.354835 containerd[1558]: time="2025-02-13T19:22:53.354779324Z" level=info msg="StartContainer for \"47bbbb78549345128e2364b906473ea5ce4f1460686c8c4bf19b387c48feb711\" returns successfully" Feb 13 19:22:53.967041 kubelet[2835]: I0213 19:22:53.966717 2835 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-8xksv" podStartSLOduration=23.839726607 podStartE2EDuration="33.966702539s" podCreationTimestamp="2025-02-13 19:22:20 +0000 UTC" firstStartedPulling="2025-02-13 19:22:43.039073619 +0000 UTC m=+35.348274253" lastFinishedPulling="2025-02-13 19:22:53.166049558 +0000 UTC m=+45.475250185" observedRunningTime="2025-02-13 19:22:53.963659251 +0000 UTC m=+46.272859888" watchObservedRunningTime="2025-02-13 19:22:53.966702539 +0000 UTC m=+46.275903177" Feb 13 19:22:54.394810 kubelet[2835]: I0213 19:22:54.394657 2835 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 19:22:54.399601 kubelet[2835]: I0213 19:22:54.399522 2835 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 19:22:57.032570 kubelet[2835]: I0213 19:22:57.032262 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:23:03.412619 kubelet[2835]: I0213 19:23:03.412056 2835 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:23:07.856959 containerd[1558]: time="2025-02-13T19:23:07.856930029Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:23:07.857257 containerd[1558]: time="2025-02-13T19:23:07.857010168Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:23:07.857257 containerd[1558]: time="2025-02-13T19:23:07.857018244Z" level=info msg="StopPodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:23:07.876895 containerd[1558]: time="2025-02-13T19:23:07.876859638Z" level=info msg="RemovePodSandbox for \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:23:07.884976 containerd[1558]: time="2025-02-13T19:23:07.884825399Z" level=info msg="Forcibly stopping sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\"" Feb 13 19:23:07.893098 containerd[1558]: time="2025-02-13T19:23:07.884893321Z" level=info msg="TearDown network for sandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" successfully" Feb 13 19:23:07.896957 containerd[1558]: time="2025-02-13T19:23:07.896898929Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.901038 containerd[1558]: time="2025-02-13T19:23:07.901010853Z" level=info msg="RemovePodSandbox \"4e61280f91db616c866824199f1789abb89b09764c69d61e81f6a771db7ab812\" returns successfully" Feb 13 19:23:07.906639 containerd[1558]: time="2025-02-13T19:23:07.906335468Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:23:07.906639 containerd[1558]: time="2025-02-13T19:23:07.906402530Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:23:07.906639 containerd[1558]: time="2025-02-13T19:23:07.906409974Z" level=info msg="StopPodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:23:07.906639 containerd[1558]: time="2025-02-13T19:23:07.906548860Z" level=info msg="RemovePodSandbox for \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:23:07.906639 containerd[1558]: time="2025-02-13T19:23:07.906559688Z" level=info msg="Forcibly stopping sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\"" Feb 13 19:23:07.906907 containerd[1558]: time="2025-02-13T19:23:07.906626671Z" level=info msg="TearDown network for sandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" successfully" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.907996747Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908024061Z" level=info msg="RemovePodSandbox \"212497bbd1e86612648c92df73b2ba39b2435f81c8a00b48f367cf39b8a3743f\" returns successfully" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908145870Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908184882Z" level=info msg="TearDown network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" successfully" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908190889Z" level=info msg="StopPodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" returns successfully" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908286439Z" level=info msg="RemovePodSandbox for \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:23:07.908313 containerd[1558]: time="2025-02-13T19:23:07.908296843Z" level=info msg="Forcibly stopping sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\"" Feb 13 19:23:07.908462 containerd[1558]: time="2025-02-13T19:23:07.908325282Z" level=info msg="TearDown network for sandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" successfully" Feb 13 19:23:07.911866 containerd[1558]: time="2025-02-13T19:23:07.911841852Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.912873 containerd[1558]: time="2025-02-13T19:23:07.911876804Z" level=info msg="RemovePodSandbox \"64492b74917f714041fbabc9270dd03cc1c4d21649ef72eb48271ab3d1f6cfe6\" returns successfully" Feb 13 19:23:07.913023 containerd[1558]: time="2025-02-13T19:23:07.912958456Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" Feb 13 19:23:07.913023 containerd[1558]: time="2025-02-13T19:23:07.913016117Z" level=info msg="TearDown network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" successfully" Feb 13 19:23:07.913656 containerd[1558]: time="2025-02-13T19:23:07.913024302Z" level=info msg="StopPodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" returns successfully" Feb 13 19:23:07.913656 containerd[1558]: time="2025-02-13T19:23:07.913195084Z" level=info msg="RemovePodSandbox for \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" Feb 13 19:23:07.913656 containerd[1558]: time="2025-02-13T19:23:07.913206471Z" level=info msg="Forcibly stopping sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\"" Feb 13 19:23:07.913656 containerd[1558]: time="2025-02-13T19:23:07.913239995Z" level=info msg="TearDown network for sandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" successfully" Feb 13 19:23:07.914903 containerd[1558]: time="2025-02-13T19:23:07.914866208Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.914948 containerd[1558]: time="2025-02-13T19:23:07.914938798Z" level=info msg="RemovePodSandbox \"61325bab233764a44971a4636e373b8c42a21b28383b89fbc9ae3b9b13e094f0\" returns successfully" Feb 13 19:23:07.915333 containerd[1558]: time="2025-02-13T19:23:07.915313196Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:23:07.916258 containerd[1558]: time="2025-02-13T19:23:07.916225683Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 19:23:07.916258 containerd[1558]: time="2025-02-13T19:23:07.916254578Z" level=info msg="StopPodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:23:07.916824 containerd[1558]: time="2025-02-13T19:23:07.916449843Z" level=info msg="RemovePodSandbox for \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:23:07.916824 containerd[1558]: time="2025-02-13T19:23:07.916464543Z" level=info msg="Forcibly stopping sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\"" Feb 13 19:23:07.916824 containerd[1558]: time="2025-02-13T19:23:07.916498795Z" level=info msg="TearDown network for sandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" successfully" Feb 13 19:23:07.918339 containerd[1558]: time="2025-02-13T19:23:07.918326176Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.918793 containerd[1558]: time="2025-02-13T19:23:07.918781299Z" level=info msg="RemovePodSandbox \"a7703140d23f7aae97cb15175fe774d6f9e49067db8f842c0243a2e196f5a8c7\" returns successfully" Feb 13 19:23:07.919039 containerd[1558]: time="2025-02-13T19:23:07.919028571Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:23:07.919229 containerd[1558]: time="2025-02-13T19:23:07.919149624Z" level=info msg="TearDown network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" successfully" Feb 13 19:23:07.919229 containerd[1558]: time="2025-02-13T19:23:07.919157998Z" level=info msg="StopPodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" returns successfully" Feb 13 19:23:07.919838 containerd[1558]: time="2025-02-13T19:23:07.919381870Z" level=info msg="RemovePodSandbox for \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:23:07.919838 containerd[1558]: time="2025-02-13T19:23:07.919392477Z" level=info msg="Forcibly stopping sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\"" Feb 13 19:23:07.919838 containerd[1558]: time="2025-02-13T19:23:07.919427187Z" level=info msg="TearDown network for sandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" successfully" Feb 13 19:23:07.921072 containerd[1558]: time="2025-02-13T19:23:07.921060168Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.921160 containerd[1558]: time="2025-02-13T19:23:07.921128323Z" level=info msg="RemovePodSandbox \"7cec4dbbd1a5289afdb629e8604d0180633d77718058cc31987763ea258f7241\" returns successfully" Feb 13 19:23:07.921440 containerd[1558]: time="2025-02-13T19:23:07.921366383Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" Feb 13 19:23:07.921440 containerd[1558]: time="2025-02-13T19:23:07.921406746Z" level=info msg="TearDown network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" successfully" Feb 13 19:23:07.921440 containerd[1558]: time="2025-02-13T19:23:07.921412667Z" level=info msg="StopPodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" returns successfully" Feb 13 19:23:07.922183 containerd[1558]: time="2025-02-13T19:23:07.921679006Z" level=info msg="RemovePodSandbox for \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" Feb 13 19:23:07.922183 containerd[1558]: time="2025-02-13T19:23:07.921690538Z" level=info msg="Forcibly stopping sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\"" Feb 13 19:23:07.922183 containerd[1558]: time="2025-02-13T19:23:07.921721136Z" level=info msg="TearDown network for sandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" successfully" Feb 13 19:23:07.928987 containerd[1558]: time="2025-02-13T19:23:07.928960677Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.929058 containerd[1558]: time="2025-02-13T19:23:07.929013985Z" level=info msg="RemovePodSandbox \"ceb6ded748d360c420e5cc165ed8d498a9f0dba6b6292aa54eda07d2d9a92d06\" returns successfully" Feb 13 19:23:07.929322 containerd[1558]: time="2025-02-13T19:23:07.929229650Z" level=info msg="StopPodSandbox for \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\"" Feb 13 19:23:07.929322 containerd[1558]: time="2025-02-13T19:23:07.929284853Z" level=info msg="TearDown network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" successfully" Feb 13 19:23:07.929322 containerd[1558]: time="2025-02-13T19:23:07.929291718Z" level=info msg="StopPodSandbox for \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" returns successfully" Feb 13 19:23:07.929419 containerd[1558]: time="2025-02-13T19:23:07.929403414Z" level=info msg="RemovePodSandbox for \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\"" Feb 13 19:23:07.929419 containerd[1558]: time="2025-02-13T19:23:07.929417222Z" level=info msg="Forcibly stopping sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\"" Feb 13 19:23:07.929471 containerd[1558]: time="2025-02-13T19:23:07.929450162Z" level=info msg="TearDown network for sandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" successfully" Feb 13 19:23:07.930789 containerd[1558]: time="2025-02-13T19:23:07.930745477Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.930789 containerd[1558]: time="2025-02-13T19:23:07.930766632Z" level=info msg="RemovePodSandbox \"acf6f5e920b498996835740c62be6cb991ae65b08b6b3f61b3d6fa31e018c379\" returns successfully" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931121619Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931162810Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931169298Z" level=info msg="StopPodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931282231Z" level=info msg="RemovePodSandbox for \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931297147Z" level=info msg="Forcibly stopping sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\"" Feb 13 19:23:07.931508 containerd[1558]: time="2025-02-13T19:23:07.931328703Z" level=info msg="TearDown network for sandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" successfully" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.932574867Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.932595360Z" level=info msg="RemovePodSandbox \"b588fda73b6fd54f2c4ea01349b529e3632e39f5e91cd681452135e3394b1cdc\" returns successfully" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.932759321Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.932855169Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.932863022Z" level=info msg="StopPodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.933005337Z" level=info msg="RemovePodSandbox for \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.933015685Z" level=info msg="Forcibly stopping sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\"" Feb 13 19:23:07.933741 containerd[1558]: time="2025-02-13T19:23:07.933057522Z" level=info msg="TearDown network for sandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" successfully" Feb 13 19:23:07.934910 containerd[1558]: time="2025-02-13T19:23:07.934892352Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.935160 containerd[1558]: time="2025-02-13T19:23:07.934925814Z" level=info msg="RemovePodSandbox \"356ffc2b8b6cdaa914db8a358e83684f2333c8f60eb4e7dcba0b3f62f724821b\" returns successfully" Feb 13 19:23:07.935443 containerd[1558]: time="2025-02-13T19:23:07.935376498Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:23:07.935443 containerd[1558]: time="2025-02-13T19:23:07.935433342Z" level=info msg="TearDown network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" successfully" Feb 13 19:23:07.935443 containerd[1558]: time="2025-02-13T19:23:07.935441732Z" level=info msg="StopPodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" returns successfully" Feb 13 19:23:07.935797 containerd[1558]: time="2025-02-13T19:23:07.935782258Z" level=info msg="RemovePodSandbox for \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:23:07.935880 containerd[1558]: time="2025-02-13T19:23:07.935797370Z" level=info msg="Forcibly stopping sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\"" Feb 13 19:23:07.936393 containerd[1558]: time="2025-02-13T19:23:07.935910621Z" level=info msg="TearDown network for sandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" successfully" Feb 13 19:23:07.937466 containerd[1558]: time="2025-02-13T19:23:07.937446986Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.937514 containerd[1558]: time="2025-02-13T19:23:07.937480298Z" level=info msg="RemovePodSandbox \"023d751541ab43ea06369295bc7f425bf0a659734be572298a18717260f14efd\" returns successfully" Feb 13 19:23:07.937725 containerd[1558]: time="2025-02-13T19:23:07.937701797Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" Feb 13 19:23:07.937759 containerd[1558]: time="2025-02-13T19:23:07.937748754Z" level=info msg="TearDown network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" successfully" Feb 13 19:23:07.937759 containerd[1558]: time="2025-02-13T19:23:07.937754508Z" level=info msg="StopPodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" returns successfully" Feb 13 19:23:07.938133 containerd[1558]: time="2025-02-13T19:23:07.938114785Z" level=info msg="RemovePodSandbox for \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" Feb 13 19:23:07.938170 containerd[1558]: time="2025-02-13T19:23:07.938135403Z" level=info msg="Forcibly stopping sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\"" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.938167349Z" level=info msg="TearDown network for sandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" successfully" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939361089Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939385015Z" level=info msg="RemovePodSandbox \"d2f5c4d19449aa00a7e3448e1bd7c9ad5e57ce2e252999b3594b031a8fc553c1\" returns successfully" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939531044Z" level=info msg="StopPodSandbox for \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\"" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939618817Z" level=info msg="TearDown network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" successfully" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939625193Z" level=info msg="StopPodSandbox for \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" returns successfully" Feb 13 19:23:07.939789 containerd[1558]: time="2025-02-13T19:23:07.939783771Z" level=info msg="RemovePodSandbox for \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\"" Feb 13 19:23:07.940726 containerd[1558]: time="2025-02-13T19:23:07.939794934Z" level=info msg="Forcibly stopping sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\"" Feb 13 19:23:07.940726 containerd[1558]: time="2025-02-13T19:23:07.939842096Z" level=info msg="TearDown network for sandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" successfully" Feb 13 19:23:07.941030 containerd[1558]: time="2025-02-13T19:23:07.941013886Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.941062 containerd[1558]: time="2025-02-13T19:23:07.941037063Z" level=info msg="RemovePodSandbox \"0107b4b9023a74642fcce410de8022c001222dfdef7adefd5f3395131f209c36\" returns successfully" Feb 13 19:23:07.941178 containerd[1558]: time="2025-02-13T19:23:07.941166133Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:23:07.941218 containerd[1558]: time="2025-02-13T19:23:07.941206567Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:23:07.941218 containerd[1558]: time="2025-02-13T19:23:07.941214007Z" level=info msg="StopPodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:23:07.941537 containerd[1558]: time="2025-02-13T19:23:07.941359637Z" level=info msg="RemovePodSandbox for \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:23:07.941537 containerd[1558]: time="2025-02-13T19:23:07.941379824Z" level=info msg="Forcibly stopping sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\"" Feb 13 19:23:07.941537 containerd[1558]: time="2025-02-13T19:23:07.941412672Z" level=info msg="TearDown network for sandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" successfully" Feb 13 19:23:07.942502 containerd[1558]: time="2025-02-13T19:23:07.942484308Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.942532 containerd[1558]: time="2025-02-13T19:23:07.942506276Z" level=info msg="RemovePodSandbox \"9cd7680a4831c5fe58040da95af6f255fe2bcfbbbd7098763f8d49756503a7d5\" returns successfully" Feb 13 19:23:07.942670 containerd[1558]: time="2025-02-13T19:23:07.942658032Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:23:07.942707 containerd[1558]: time="2025-02-13T19:23:07.942698032Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:23:07.942707 containerd[1558]: time="2025-02-13T19:23:07.942705313Z" level=info msg="StopPodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:23:07.942829 containerd[1558]: time="2025-02-13T19:23:07.942807453Z" level=info msg="RemovePodSandbox for \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:23:07.942829 containerd[1558]: time="2025-02-13T19:23:07.942828107Z" level=info msg="Forcibly stopping sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\"" Feb 13 19:23:07.942873 containerd[1558]: time="2025-02-13T19:23:07.942857514Z" level=info msg="TearDown network for sandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" successfully" Feb 13 19:23:07.943914 containerd[1558]: time="2025-02-13T19:23:07.943899799Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.943944 containerd[1558]: time="2025-02-13T19:23:07.943920279Z" level=info msg="RemovePodSandbox \"2b6db98987475cd18b3fa85ad54b198bf95807e20e827ed52ac429df60782c58\" returns successfully" Feb 13 19:23:07.944083 containerd[1558]: time="2025-02-13T19:23:07.944069519Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:23:07.944134 containerd[1558]: time="2025-02-13T19:23:07.944121248Z" level=info msg="TearDown network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" successfully" Feb 13 19:23:07.944134 containerd[1558]: time="2025-02-13T19:23:07.944130161Z" level=info msg="StopPodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" returns successfully" Feb 13 19:23:07.944740 containerd[1558]: time="2025-02-13T19:23:07.944727300Z" level=info msg="RemovePodSandbox for \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:23:07.944771 containerd[1558]: time="2025-02-13T19:23:07.944740252Z" level=info msg="Forcibly stopping sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\"" Feb 13 19:23:07.944795 containerd[1558]: time="2025-02-13T19:23:07.944772221Z" level=info msg="TearDown network for sandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" successfully" Feb 13 19:23:07.951082 containerd[1558]: time="2025-02-13T19:23:07.951066990Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.951189 containerd[1558]: time="2025-02-13T19:23:07.951090060Z" level=info msg="RemovePodSandbox \"33e3811d02c2242b8e7cd2cc1e9b38c506ed95c16e695884b68e1840fec34b02\" returns successfully" Feb 13 19:23:07.951218 containerd[1558]: time="2025-02-13T19:23:07.951211827Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" Feb 13 19:23:07.951331 containerd[1558]: time="2025-02-13T19:23:07.951248716Z" level=info msg="TearDown network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" successfully" Feb 13 19:23:07.951331 containerd[1558]: time="2025-02-13T19:23:07.951256287Z" level=info msg="StopPodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" returns successfully" Feb 13 19:23:07.951905 containerd[1558]: time="2025-02-13T19:23:07.951445073Z" level=info msg="RemovePodSandbox for \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" Feb 13 19:23:07.951905 containerd[1558]: time="2025-02-13T19:23:07.951459412Z" level=info msg="Forcibly stopping sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\"" Feb 13 19:23:07.951905 containerd[1558]: time="2025-02-13T19:23:07.951491918Z" level=info msg="TearDown network for sandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" successfully" Feb 13 19:23:07.952898 containerd[1558]: time="2025-02-13T19:23:07.952881482Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.952936 containerd[1558]: time="2025-02-13T19:23:07.952906024Z" level=info msg="RemovePodSandbox \"274188b404d2f44600927b66baf985fb41304c0c9b0623c435ad6e6a83916777\" returns successfully" Feb 13 19:23:07.953092 containerd[1558]: time="2025-02-13T19:23:07.953082958Z" level=info msg="StopPodSandbox for \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\"" Feb 13 19:23:07.953208 containerd[1558]: time="2025-02-13T19:23:07.953168549Z" level=info msg="TearDown network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" successfully" Feb 13 19:23:07.953208 containerd[1558]: time="2025-02-13T19:23:07.953176829Z" level=info msg="StopPodSandbox for \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" returns successfully" Feb 13 19:23:07.953809 containerd[1558]: time="2025-02-13T19:23:07.953346959Z" level=info msg="RemovePodSandbox for \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\"" Feb 13 19:23:07.953809 containerd[1558]: time="2025-02-13T19:23:07.953375291Z" level=info msg="Forcibly stopping sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\"" Feb 13 19:23:07.953809 containerd[1558]: time="2025-02-13T19:23:07.953408532Z" level=info msg="TearDown network for sandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" successfully" Feb 13 19:23:07.954487 containerd[1558]: time="2025-02-13T19:23:07.954472635Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.954518 containerd[1558]: time="2025-02-13T19:23:07.954494216Z" level=info msg="RemovePodSandbox \"f41e955e7f78484d3200a5a8bb9f33cbb8a47e7706ba9acbbb09520738e98513\" returns successfully" Feb 13 19:23:07.954765 containerd[1558]: time="2025-02-13T19:23:07.954632343Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:23:07.954765 containerd[1558]: time="2025-02-13T19:23:07.954671979Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:23:07.954765 containerd[1558]: time="2025-02-13T19:23:07.954678119Z" level=info msg="StopPodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:23:07.958468 containerd[1558]: time="2025-02-13T19:23:07.958448344Z" level=info msg="RemovePodSandbox for \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:23:07.958468 containerd[1558]: time="2025-02-13T19:23:07.958462731Z" level=info msg="Forcibly stopping sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\"" Feb 13 19:23:07.958558 containerd[1558]: time="2025-02-13T19:23:07.958498522Z" level=info msg="TearDown network for sandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" successfully" Feb 13 19:23:07.959593 containerd[1558]: time="2025-02-13T19:23:07.959577700Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.959621 containerd[1558]: time="2025-02-13T19:23:07.959601150Z" level=info msg="RemovePodSandbox \"eac83e8916392e6baee9b55dd65ed62f1ae65f91e071e104fb8067be58f60518\" returns successfully" Feb 13 19:23:07.959765 containerd[1558]: time="2025-02-13T19:23:07.959755472Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:23:07.959949 containerd[1558]: time="2025-02-13T19:23:07.959909416Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:23:07.959949 containerd[1558]: time="2025-02-13T19:23:07.959919501Z" level=info msg="StopPodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:23:07.961095 containerd[1558]: time="2025-02-13T19:23:07.960241755Z" level=info msg="RemovePodSandbox for \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:23:07.961095 containerd[1558]: time="2025-02-13T19:23:07.960258664Z" level=info msg="Forcibly stopping sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\"" Feb 13 19:23:07.961095 containerd[1558]: time="2025-02-13T19:23:07.960293487Z" level=info msg="TearDown network for sandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" successfully" Feb 13 19:23:07.961621 containerd[1558]: time="2025-02-13T19:23:07.961606798Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.961653 containerd[1558]: time="2025-02-13T19:23:07.961630388Z" level=info msg="RemovePodSandbox \"2cd256b5c8439734409832c78cddb54b1a7840b31764f80b717524709ac39176\" returns successfully" Feb 13 19:23:07.961759 containerd[1558]: time="2025-02-13T19:23:07.961749112Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:23:07.961890 containerd[1558]: time="2025-02-13T19:23:07.961858943Z" level=info msg="TearDown network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" successfully" Feb 13 19:23:07.961890 containerd[1558]: time="2025-02-13T19:23:07.961868234Z" level=info msg="StopPodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" returns successfully" Feb 13 19:23:07.962088 containerd[1558]: time="2025-02-13T19:23:07.962073365Z" level=info msg="RemovePodSandbox for \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:23:07.962142 containerd[1558]: time="2025-02-13T19:23:07.962088035Z" level=info msg="Forcibly stopping sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\"" Feb 13 19:23:07.962186 containerd[1558]: time="2025-02-13T19:23:07.962154385Z" level=info msg="TearDown network for sandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" successfully" Feb 13 19:23:07.963250 containerd[1558]: time="2025-02-13T19:23:07.963234861Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.963276 containerd[1558]: time="2025-02-13T19:23:07.963256106Z" level=info msg="RemovePodSandbox \"ecfbd47518f1c4cd6b0d5a691097ba2dc7686664ea13b962faf63ec839ab8b82\" returns successfully" Feb 13 19:23:07.963643 containerd[1558]: time="2025-02-13T19:23:07.963472129Z" level=info msg="StopPodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" Feb 13 19:23:07.963643 containerd[1558]: time="2025-02-13T19:23:07.963524802Z" level=info msg="TearDown network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" successfully" Feb 13 19:23:07.963643 containerd[1558]: time="2025-02-13T19:23:07.963531268Z" level=info msg="StopPodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" returns successfully" Feb 13 19:23:07.964966 containerd[1558]: time="2025-02-13T19:23:07.963901967Z" level=info msg="RemovePodSandbox for \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" Feb 13 19:23:07.964966 containerd[1558]: time="2025-02-13T19:23:07.963913595Z" level=info msg="Forcibly stopping sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\"" Feb 13 19:23:07.964966 containerd[1558]: time="2025-02-13T19:23:07.963945174Z" level=info msg="TearDown network for sandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" successfully" Feb 13 19:23:07.965693 containerd[1558]: time="2025-02-13T19:23:07.965675183Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.965837 containerd[1558]: time="2025-02-13T19:23:07.965825808Z" level=info msg="RemovePodSandbox \"ba578981e2a29ab3557229612a443c786e9f4e4021a459771649a018afc5e829\" returns successfully" Feb 13 19:23:07.966100 containerd[1558]: time="2025-02-13T19:23:07.966084426Z" level=info msg="StopPodSandbox for \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\"" Feb 13 19:23:07.966144 containerd[1558]: time="2025-02-13T19:23:07.966132414Z" level=info msg="TearDown network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" successfully" Feb 13 19:23:07.966144 containerd[1558]: time="2025-02-13T19:23:07.966141638Z" level=info msg="StopPodSandbox for \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" returns successfully" Feb 13 19:23:07.966329 containerd[1558]: time="2025-02-13T19:23:07.966294102Z" level=info msg="RemovePodSandbox for \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\"" Feb 13 19:23:07.966329 containerd[1558]: time="2025-02-13T19:23:07.966306576Z" level=info msg="Forcibly stopping sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\"" Feb 13 19:23:07.966922 containerd[1558]: time="2025-02-13T19:23:07.966446118Z" level=info msg="TearDown network for sandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" successfully" Feb 13 19:23:07.967611 containerd[1558]: time="2025-02-13T19:23:07.967593445Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.967670 containerd[1558]: time="2025-02-13T19:23:07.967622850Z" level=info msg="RemovePodSandbox \"72097b6c6cdec49029ec88ee5d6cc304660221e67654f6e83f6e13c2474e3b47\" returns successfully" Feb 13 19:23:07.967917 containerd[1558]: time="2025-02-13T19:23:07.967839226Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:23:07.967917 containerd[1558]: time="2025-02-13T19:23:07.967883369Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:23:07.967917 containerd[1558]: time="2025-02-13T19:23:07.967889731Z" level=info msg="StopPodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:23:07.971230 containerd[1558]: time="2025-02-13T19:23:07.971132686Z" level=info msg="RemovePodSandbox for \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:23:07.971230 containerd[1558]: time="2025-02-13T19:23:07.971146555Z" level=info msg="Forcibly stopping sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\"" Feb 13 19:23:07.971230 containerd[1558]: time="2025-02-13T19:23:07.971183937Z" level=info msg="TearDown network for sandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" successfully" Feb 13 19:23:07.972389 containerd[1558]: time="2025-02-13T19:23:07.972373226Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.972421 containerd[1558]: time="2025-02-13T19:23:07.972398960Z" level=info msg="RemovePodSandbox \"9d7869302fd217308f176d5c727e5a14a4ebe76ea1a613b9eb69bba677a6fde1\" returns successfully" Feb 13 19:23:07.972578 containerd[1558]: time="2025-02-13T19:23:07.972565420Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:23:07.972617 containerd[1558]: time="2025-02-13T19:23:07.972606276Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:23:07.972617 containerd[1558]: time="2025-02-13T19:23:07.972615281Z" level=info msg="StopPodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" returns successfully" Feb 13 19:23:07.972796 containerd[1558]: time="2025-02-13T19:23:07.972784900Z" level=info msg="RemovePodSandbox for \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:23:07.972831 containerd[1558]: time="2025-02-13T19:23:07.972804519Z" level=info msg="Forcibly stopping sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\"" Feb 13 19:23:07.972872 containerd[1558]: time="2025-02-13T19:23:07.972850448Z" level=info msg="TearDown network for sandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" successfully" Feb 13 19:23:07.975853 containerd[1558]: time="2025-02-13T19:23:07.974701299Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.975853 containerd[1558]: time="2025-02-13T19:23:07.974742193Z" level=info msg="RemovePodSandbox \"8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0\" returns successfully" Feb 13 19:23:07.976298 containerd[1558]: time="2025-02-13T19:23:07.976281613Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:23:07.976490 containerd[1558]: time="2025-02-13T19:23:07.976338493Z" level=info msg="TearDown network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" successfully" Feb 13 19:23:07.976490 containerd[1558]: time="2025-02-13T19:23:07.976347930Z" level=info msg="StopPodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" returns successfully" Feb 13 19:23:07.976490 containerd[1558]: time="2025-02-13T19:23:07.976476207Z" level=info msg="RemovePodSandbox for \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:23:07.976553 containerd[1558]: time="2025-02-13T19:23:07.976486399Z" level=info msg="Forcibly stopping sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\"" Feb 13 19:23:07.976553 containerd[1558]: time="2025-02-13T19:23:07.976536392Z" level=info msg="TearDown network for sandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" successfully" Feb 13 19:23:07.978322 containerd[1558]: time="2025-02-13T19:23:07.978305825Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.978363 containerd[1558]: time="2025-02-13T19:23:07.978328772Z" level=info msg="RemovePodSandbox \"b1790265b538965cc7e6e0ee45127d2653c1f47e20b607059818f4c181ec44db\" returns successfully" Feb 13 19:23:07.978526 containerd[1558]: time="2025-02-13T19:23:07.978511871Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" Feb 13 19:23:07.978595 containerd[1558]: time="2025-02-13T19:23:07.978564905Z" level=info msg="TearDown network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" successfully" Feb 13 19:23:07.978595 containerd[1558]: time="2025-02-13T19:23:07.978592020Z" level=info msg="StopPodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" returns successfully" Feb 13 19:23:07.978735 containerd[1558]: time="2025-02-13T19:23:07.978721396Z" level=info msg="RemovePodSandbox for \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" Feb 13 19:23:07.979469 containerd[1558]: time="2025-02-13T19:23:07.978734762Z" level=info msg="Forcibly stopping sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\"" Feb 13 19:23:07.991321 containerd[1558]: time="2025-02-13T19:23:07.991274815Z" level=info msg="TearDown network for sandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" successfully" Feb 13 19:23:07.992559 containerd[1558]: time="2025-02-13T19:23:07.992542619Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:23:07.992607 containerd[1558]: time="2025-02-13T19:23:07.992565920Z" level=info msg="RemovePodSandbox \"9924c525acf1ef4fdc7bd8cdaf19223e724462a9db5a54cbfd87f111d8845549\" returns successfully" Feb 13 19:23:07.992940 containerd[1558]: time="2025-02-13T19:23:07.992860715Z" level=info msg="StopPodSandbox for \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\"" Feb 13 19:23:07.992940 containerd[1558]: time="2025-02-13T19:23:07.992903615Z" level=info msg="TearDown network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" successfully" Feb 13 19:23:07.992940 containerd[1558]: time="2025-02-13T19:23:07.992910516Z" level=info msg="StopPodSandbox for \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" returns successfully" Feb 13 19:23:07.993533 containerd[1558]: time="2025-02-13T19:23:07.993137933Z" level=info msg="RemovePodSandbox for \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\"" Feb 13 19:23:07.993533 containerd[1558]: time="2025-02-13T19:23:07.993151965Z" level=info msg="Forcibly stopping sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\"" Feb 13 19:23:07.993533 containerd[1558]: time="2025-02-13T19:23:07.993194787Z" level=info msg="TearDown network for sandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" successfully" Feb 13 19:23:07.994317 containerd[1558]: time="2025-02-13T19:23:07.994301566Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:23:07.994342 containerd[1558]: time="2025-02-13T19:23:07.994323272Z" level=info msg="RemovePodSandbox \"e79f155ebebc4825d8702bdc3aec1fe3d90a3fd3f8a76eaa570a3c907655c271\" returns successfully"
Feb 13 19:23:22.963949 systemd[1]: Started sshd@7-139.178.70.106:22-139.178.89.65:39410.service - OpenSSH per-connection server daemon (139.178.89.65:39410).
Feb 13 19:23:23.048082 sshd[5688]: Accepted publickey for core from 139.178.89.65 port 39410 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:23.051400 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:23.058056 systemd-logind[1542]: New session 10 of user core.
Feb 13 19:23:23.060991 systemd[1]: Started session-10.scope - Session 10 of User core.
Feb 13 19:23:23.571080 sshd[5690]: Connection closed by 139.178.89.65 port 39410
Feb 13 19:23:23.571565 sshd-session[5688]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:23.573493 systemd-logind[1542]: Session 10 logged out. Waiting for processes to exit.
Feb 13 19:23:23.573609 systemd[1]: sshd@7-139.178.70.106:22-139.178.89.65:39410.service: Deactivated successfully.
Feb 13 19:23:23.574683 systemd[1]: session-10.scope: Deactivated successfully.
Feb 13 19:23:23.575721 systemd-logind[1542]: Removed session 10.
Feb 13 19:23:28.587055 systemd[1]: Started sshd@8-139.178.70.106:22-139.178.89.65:39522.service - OpenSSH per-connection server daemon (139.178.89.65:39522).
Feb 13 19:23:28.637441 sshd[5713]: Accepted publickey for core from 139.178.89.65 port 39522 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:28.638202 sshd-session[5713]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:28.641414 systemd-logind[1542]: New session 11 of user core.
Feb 13 19:23:28.646901 systemd[1]: Started session-11.scope - Session 11 of User core.
Feb 13 19:23:28.755489 sshd[5715]: Connection closed by 139.178.89.65 port 39522
Feb 13 19:23:28.756134 sshd-session[5713]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:28.757795 systemd-logind[1542]: Session 11 logged out. Waiting for processes to exit.
Feb 13 19:23:28.757919 systemd[1]: sshd@8-139.178.70.106:22-139.178.89.65:39522.service: Deactivated successfully.
Feb 13 19:23:28.759012 systemd[1]: session-11.scope: Deactivated successfully.
Feb 13 19:23:28.759898 systemd-logind[1542]: Removed session 11.
Feb 13 19:23:33.767702 systemd[1]: Started sshd@9-139.178.70.106:22-139.178.89.65:39532.service - OpenSSH per-connection server daemon (139.178.89.65:39532).
Feb 13 19:23:33.818831 sshd[5727]: Accepted publickey for core from 139.178.89.65 port 39532 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:33.819759 sshd-session[5727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:33.823181 systemd-logind[1542]: New session 12 of user core.
Feb 13 19:23:33.827958 systemd[1]: Started session-12.scope - Session 12 of User core.
Feb 13 19:23:33.921151 sshd[5729]: Connection closed by 139.178.89.65 port 39532
Feb 13 19:23:33.922338 sshd-session[5727]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:33.929901 systemd[1]: sshd@9-139.178.70.106:22-139.178.89.65:39532.service: Deactivated successfully.
Feb 13 19:23:33.931187 systemd[1]: session-12.scope: Deactivated successfully.
Feb 13 19:23:33.931928 systemd-logind[1542]: Session 12 logged out. Waiting for processes to exit.
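
The containerd entries above trace the CRI sandbox garbage-collection sequence: StopPodSandbox tears down the sandbox's network namespace, RemovePodSandbox deletes its metadata, and the "Forcibly stopping sandbox" pass retries the stop immediately before removal. The "Failed to get podSandbox status ... not found" warnings are benign here: by the time the container event is published, the sandbox record is already gone, so the event is sent with a nil status, exactly as the message says. As a rough sketch of the client side that drives this sequence, the Go program below issues the same two CRI calls against containerd's socket; the socket path is an assumption for illustration, and the sandbox ID is copied from the log above. In a real cluster the kubelet makes these calls itself.

    // Minimal sketch (not the kubelet's actual code) of the CRI calls that
    // produce the StopPodSandbox/RemovePodSandbox entries above.
    package main

    import (
        "context"
        "log"
        "time"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // containerd's default CRI endpoint; adjust for your host (assumption).
        conn, err := grpc.Dial("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()
        rt := runtimeapi.NewRuntimeServiceClient(conn)

        ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
        defer cancel()

        // Sandbox ID taken from the log above, purely as an example.
        id := "8e2b75c9d0cfe4fc678bdc8eff6b27110d0719a6f6fda90964ac6c38be894ca0"

        // Stop tears down the sandbox network ("TearDown network ... successfully").
        if _, err := rt.StopPodSandbox(ctx, &runtimeapi.StopPodSandboxRequest{PodSandboxId: id}); err != nil {
            log.Printf("stop: %v", err)
        }
        // Remove deletes sandbox metadata ("RemovePodSandbox ... returns successfully").
        if _, err := rt.RemovePodSandbox(ctx, &runtimeapi.RemovePodSandboxRequest{PodSandboxId: id}); err != nil {
            log.Printf("remove: %v", err)
        }
    }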
Feb 13 19:23:33.938054 systemd[1]: Started sshd@10-139.178.70.106:22-139.178.89.65:39538.service - OpenSSH per-connection server daemon (139.178.89.65:39538).
Feb 13 19:23:33.939464 systemd-logind[1542]: Removed session 12.
Feb 13 19:23:33.965917 sshd[5741]: Accepted publickey for core from 139.178.89.65 port 39538 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:33.966614 sshd-session[5741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:33.970199 systemd-logind[1542]: New session 13 of user core.
Feb 13 19:23:33.973030 systemd[1]: Started session-13.scope - Session 13 of User core.
Feb 13 19:23:34.095809 sshd[5744]: Connection closed by 139.178.89.65 port 39538
Feb 13 19:23:34.096608 sshd-session[5741]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:34.106383 systemd[1]: sshd@10-139.178.70.106:22-139.178.89.65:39538.service: Deactivated successfully.
Feb 13 19:23:34.107660 systemd[1]: session-13.scope: Deactivated successfully.
Feb 13 19:23:34.110275 systemd-logind[1542]: Session 13 logged out. Waiting for processes to exit.
Feb 13 19:23:34.118770 systemd[1]: Started sshd@11-139.178.70.106:22-139.178.89.65:39552.service - OpenSSH per-connection server daemon (139.178.89.65:39552).
Feb 13 19:23:34.120561 systemd-logind[1542]: Removed session 13.
Feb 13 19:23:34.185669 sshd[5752]: Accepted publickey for core from 139.178.89.65 port 39552 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:34.186502 sshd-session[5752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:34.189555 systemd-logind[1542]: New session 14 of user core.
Feb 13 19:23:34.195902 systemd[1]: Started session-14.scope - Session 14 of User core.
Feb 13 19:23:34.296888 sshd[5755]: Connection closed by 139.178.89.65 port 39552
Feb 13 19:23:34.297217 sshd-session[5752]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:34.299176 systemd[1]: sshd@11-139.178.70.106:22-139.178.89.65:39552.service: Deactivated successfully.
Feb 13 19:23:34.300279 systemd[1]: session-14.scope: Deactivated successfully.
Feb 13 19:23:34.300676 systemd-logind[1542]: Session 14 logged out. Waiting for processes to exit.
Feb 13 19:23:34.301520 systemd-logind[1542]: Removed session 14.
Feb 13 19:23:39.307890 systemd[1]: Started sshd@12-139.178.70.106:22-139.178.89.65:48448.service - OpenSSH per-connection server daemon (139.178.89.65:48448).
Feb 13 19:23:39.340566 sshd[5768]: Accepted publickey for core from 139.178.89.65 port 48448 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:39.341653 sshd-session[5768]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:39.345412 systemd-logind[1542]: New session 15 of user core.
Feb 13 19:23:39.351999 systemd[1]: Started session-15.scope - Session 15 of User core.
Feb 13 19:23:39.445844 sshd[5770]: Connection closed by 139.178.89.65 port 48448
Feb 13 19:23:39.445720 sshd-session[5768]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:39.448269 systemd[1]: sshd@12-139.178.70.106:22-139.178.89.65:48448.service: Deactivated successfully.
Feb 13 19:23:39.449529 systemd[1]: session-15.scope: Deactivated successfully.
Feb 13 19:23:39.450121 systemd-logind[1542]: Session 15 logged out. Waiting for processes to exit.
Feb 13 19:23:39.450648 systemd-logind[1542]: Removed session 15.
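
The sshd@7 through sshd@12 service names above follow systemd's per-connection socket-activation pattern: a listening sshd.socket with Accept=yes spawns one short-lived sshd@<counter>-<local>:22-<remote>:<port>.service instance per TCP connection. That is why every session is bracketed by a "Started ... per-connection server daemon" line and a matching pair of "Deactivated successfully" lines for the service and its session scope. A minimal sketch of such a unit pair is below; the exact options shipped on this image are an assumption, not read from the host.

    # sshd.socket (sketch)
    [Socket]
    ListenStream=22
    Accept=yes                      # spawn one sshd@<instance>.service per connection

    # sshd@.service (sketch)
    [Service]
    ExecStart=-/usr/sbin/sshd -i    # -i: serve a single inetd-style connection
    StandardInput=socket            # hand the accepted connection to sshd on stdin

The trade-off of Accept=yes is one sshd process per connection and no persistent listening daemon, which matches the fork-per-session churn visible in this log.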
Feb 13 19:23:44.457653 systemd[1]: Started sshd@13-139.178.70.106:22-139.178.89.65:60444.service - OpenSSH per-connection server daemon (139.178.89.65:60444).
Feb 13 19:23:44.519205 sshd[5786]: Accepted publickey for core from 139.178.89.65 port 60444 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:44.520360 sshd-session[5786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:44.524920 systemd-logind[1542]: New session 16 of user core.
Feb 13 19:23:44.528962 systemd[1]: Started session-16.scope - Session 16 of User core.
Feb 13 19:23:44.775690 sshd[5788]: Connection closed by 139.178.89.65 port 60444
Feb 13 19:23:44.776156 sshd-session[5786]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:44.784630 systemd[1]: sshd@13-139.178.70.106:22-139.178.89.65:60444.service: Deactivated successfully.
Feb 13 19:23:44.785715 systemd[1]: session-16.scope: Deactivated successfully.
Feb 13 19:23:44.786173 systemd-logind[1542]: Session 16 logged out. Waiting for processes to exit.
Feb 13 19:23:44.786764 systemd-logind[1542]: Removed session 16.
Feb 13 19:23:49.790505 systemd[1]: Started sshd@14-139.178.70.106:22-139.178.89.65:60450.service - OpenSSH per-connection server daemon (139.178.89.65:60450).
Feb 13 19:23:49.995830 sshd[5842]: Accepted publickey for core from 139.178.89.65 port 60450 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:49.997245 sshd-session[5842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:50.000098 systemd-logind[1542]: New session 17 of user core.
Feb 13 19:23:50.006901 systemd[1]: Started session-17.scope - Session 17 of User core.
Feb 13 19:23:50.252105 sshd[5844]: Connection closed by 139.178.89.65 port 60450
Feb 13 19:23:50.252513 sshd-session[5842]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:50.259238 systemd[1]: Started sshd@15-139.178.70.106:22-139.178.89.65:60464.service - OpenSSH per-connection server daemon (139.178.89.65:60464).
Feb 13 19:23:50.262494 systemd[1]: sshd@14-139.178.70.106:22-139.178.89.65:60450.service: Deactivated successfully.
Feb 13 19:23:50.264487 systemd[1]: session-17.scope: Deactivated successfully.
Feb 13 19:23:50.267171 systemd-logind[1542]: Session 17 logged out. Waiting for processes to exit.
Feb 13 19:23:50.268418 systemd-logind[1542]: Removed session 17.
Feb 13 19:23:50.308383 sshd[5853]: Accepted publickey for core from 139.178.89.65 port 60464 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:50.309481 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:50.313858 systemd-logind[1542]: New session 18 of user core.
Feb 13 19:23:50.321920 systemd[1]: Started session-18.scope - Session 18 of User core.
Feb 13 19:23:50.664957 sshd[5858]: Connection closed by 139.178.89.65 port 60464
Feb 13 19:23:50.665199 sshd-session[5853]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:50.673364 systemd[1]: sshd@15-139.178.70.106:22-139.178.89.65:60464.service: Deactivated successfully.
Feb 13 19:23:50.674432 systemd[1]: session-18.scope: Deactivated successfully.
Feb 13 19:23:50.675455 systemd-logind[1542]: Session 18 logged out. Waiting for processes to exit.
Feb 13 19:23:50.678171 systemd[1]: Started sshd@16-139.178.70.106:22-139.178.89.65:60472.service - OpenSSH per-connection server daemon (139.178.89.65:60472).
Feb 13 19:23:50.679629 systemd-logind[1542]: Removed session 18.
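
Each accepted connection also walks the PAM session stack: the pam_unix(sshd:session) open/close lines and the systemd-logind "New session N of user core" / session-N.scope lines come from two separate session modules, pam_unix doing the accounting and pam_systemd registering the session with logind. A plausible sketch of the relevant /etc/pam.d/sshd fragment follows; it is assumed for illustration, not dumped from this host.

    # /etc/pam.d/sshd, session stack (sketch)
    session required  pam_unix.so      # emits "session opened/closed for user core"
    session optional  pam_systemd.so   # registers with logind, creating session-N.scope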
Feb 13 19:23:50.724555 sshd[5868]: Accepted publickey for core from 139.178.89.65 port 60472 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:50.725357 sshd-session[5868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:50.727909 systemd-logind[1542]: New session 19 of user core.
Feb 13 19:23:50.733911 systemd[1]: Started session-19.scope - Session 19 of User core.
Feb 13 19:23:51.613834 sshd[5871]: Connection closed by 139.178.89.65 port 60472
Feb 13 19:23:51.614182 sshd-session[5868]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:51.625325 systemd[1]: sshd@16-139.178.70.106:22-139.178.89.65:60472.service: Deactivated successfully.
Feb 13 19:23:51.627808 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 19:23:51.629351 systemd-logind[1542]: Session 19 logged out. Waiting for processes to exit.
Feb 13 19:23:51.636091 systemd[1]: Started sshd@17-139.178.70.106:22-139.178.89.65:60480.service - OpenSSH per-connection server daemon (139.178.89.65:60480).
Feb 13 19:23:51.637379 systemd-logind[1542]: Removed session 19.
Feb 13 19:23:51.707779 sshd[5887]: Accepted publickey for core from 139.178.89.65 port 60480 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:51.708567 sshd-session[5887]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:51.711679 systemd-logind[1542]: New session 20 of user core.
Feb 13 19:23:51.714893 systemd[1]: Started session-20.scope - Session 20 of User core.
Feb 13 19:23:51.906067 sshd[5890]: Connection closed by 139.178.89.65 port 60480
Feb 13 19:23:51.906103 sshd-session[5887]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:51.923214 systemd[1]: Started sshd@18-139.178.70.106:22-139.178.89.65:60486.service - OpenSSH per-connection server daemon (139.178.89.65:60486).
Feb 13 19:23:51.924313 systemd[1]: sshd@17-139.178.70.106:22-139.178.89.65:60480.service: Deactivated successfully.
Feb 13 19:23:51.925730 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 19:23:51.926662 systemd-logind[1542]: Session 20 logged out. Waiting for processes to exit.
Feb 13 19:23:51.927358 systemd-logind[1542]: Removed session 20.
Feb 13 19:23:52.005516 sshd[5897]: Accepted publickey for core from 139.178.89.65 port 60486 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:52.006923 sshd-session[5897]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:52.011445 systemd-logind[1542]: New session 21 of user core.
Feb 13 19:23:52.023987 systemd[1]: Started session-21.scope - Session 21 of User core.
Feb 13 19:23:52.127366 sshd[5902]: Connection closed by 139.178.89.65 port 60486
Feb 13 19:23:52.127719 sshd-session[5897]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:52.129460 systemd-logind[1542]: Session 21 logged out. Waiting for processes to exit.
Feb 13 19:23:52.130021 systemd[1]: sshd@18-139.178.70.106:22-139.178.89.65:60486.service: Deactivated successfully.
Feb 13 19:23:52.131255 systemd[1]: session-21.scope: Deactivated successfully.
Feb 13 19:23:52.132248 systemd-logind[1542]: Removed session 21.
Feb 13 19:23:57.138251 systemd[1]: Started sshd@19-139.178.70.106:22-139.178.89.65:52106.service - OpenSSH per-connection server daemon (139.178.89.65:52106).
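
Sessions 17 through 21 above open and close within a couple of seconds of one another, and the next per-connection service is sometimes started before the previous one finishes deactivating, which is why a "Started sshd@..." line can precede the prior service's "Deactivated successfully" line. On a live host, the transient units this churn creates can be inspected while they exist; the commands below are standard, but the session number 20 is only an example taken from the log.

    loginctl list-sessions              # enumerate active logind sessions
    loginctl session-status 20          # scope, leader PID, and TTY of one session
    systemctl status session-20.scope   # the transient scope unit logind created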
Feb 13 19:23:57.353455 sshd[5916]: Accepted publickey for core from 139.178.89.65 port 52106 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:23:57.354971 sshd-session[5916]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:23:57.358373 systemd-logind[1542]: New session 22 of user core.
Feb 13 19:23:57.364094 systemd[1]: Started session-22.scope - Session 22 of User core.
Feb 13 19:23:57.571680 sshd[5918]: Connection closed by 139.178.89.65 port 52106
Feb 13 19:23:57.571990 sshd-session[5916]: pam_unix(sshd:session): session closed for user core
Feb 13 19:23:57.575909 systemd[1]: sshd@19-139.178.70.106:22-139.178.89.65:52106.service: Deactivated successfully.
Feb 13 19:23:57.577387 systemd[1]: session-22.scope: Deactivated successfully.
Feb 13 19:23:57.578518 systemd-logind[1542]: Session 22 logged out. Waiting for processes to exit.
Feb 13 19:23:57.579175 systemd-logind[1542]: Removed session 22.
Feb 13 19:24:02.586002 systemd[1]: Started sshd@20-139.178.70.106:22-139.178.89.65:52114.service - OpenSSH per-connection server daemon (139.178.89.65:52114).
Feb 13 19:24:02.793791 sshd[5929]: Accepted publickey for core from 139.178.89.65 port 52114 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:24:02.794850 sshd-session[5929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:24:02.798363 systemd-logind[1542]: New session 23 of user core.
Feb 13 19:24:02.805954 systemd[1]: Started session-23.scope - Session 23 of User core.
Feb 13 19:24:02.963939 sshd[5931]: Connection closed by 139.178.89.65 port 52114
Feb 13 19:24:02.964455 sshd-session[5929]: pam_unix(sshd:session): session closed for user core
Feb 13 19:24:02.966117 systemd-logind[1542]: Session 23 logged out. Waiting for processes to exit.
Feb 13 19:24:02.966282 systemd[1]: sshd@20-139.178.70.106:22-139.178.89.65:52114.service: Deactivated successfully.
Feb 13 19:24:02.968062 systemd[1]: session-23.scope: Deactivated successfully.
Feb 13 19:24:02.968997 systemd-logind[1542]: Removed session 23.
Feb 13 19:24:07.977441 systemd[1]: Started sshd@21-139.178.70.106:22-139.178.89.65:52644.service - OpenSSH per-connection server daemon (139.178.89.65:52644).
Feb 13 19:24:08.083539 sshd[5975]: Accepted publickey for core from 139.178.89.65 port 52644 ssh2: RSA SHA256:NL/G37P9/eR99zDJKW+V9taUH0wkZ8ddZwzfBGT7QcM
Feb 13 19:24:08.084504 sshd-session[5975]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:24:08.087714 systemd-logind[1542]: New session 24 of user core.
Feb 13 19:24:08.093031 systemd[1]: Started session-24.scope - Session 24 of User core.
Feb 13 19:24:08.306388 sshd[5977]: Connection closed by 139.178.89.65 port 52644
Feb 13 19:24:08.307238 sshd-session[5975]: pam_unix(sshd:session): session closed for user core
Feb 13 19:24:08.309545 systemd[1]: sshd@21-139.178.70.106:22-139.178.89.65:52644.service: Deactivated successfully.
Feb 13 19:24:08.310824 systemd[1]: session-24.scope: Deactivated successfully.
Feb 13 19:24:08.311903 systemd-logind[1542]: Session 24 logged out. Waiting for processes to exit.
Feb 13 19:24:08.312556 systemd-logind[1542]: Removed session 24.
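
When reading a mixed journal like this one, the three strands above (containerd sandbox cleanup, sshd session lines, and systemd unit lifecycle) can be pulled apart with standard journalctl matches; the time window below is just the span of this excerpt:

    journalctl -t containerd --since 19:23:00 --until 19:24:10   # containerd sandbox GC entries
    journalctl _COMM=sshd-session                                # pam_unix session open/close lines
    journalctl -u "sshd@*" --no-pager                            # per-connection service lifecycle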