Feb 13 15:59:47.745185 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 14:00:20 -00 2025
Feb 13 15:59:47.745203 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65
Feb 13 15:59:47.745209 kernel: Disabled fast string operations
Feb 13 15:59:47.745214 kernel: BIOS-provided physical RAM map:
Feb 13 15:59:47.745217 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Feb 13 15:59:47.745222 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Feb 13 15:59:47.745228 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Feb 13 15:59:47.745232 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Feb 13 15:59:47.745236 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Feb 13 15:59:47.745241 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Feb 13 15:59:47.745245 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Feb 13 15:59:47.745257 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Feb 13 15:59:47.745262 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Feb 13 15:59:47.745266 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 15:59:47.745281 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Feb 13 15:59:47.745286 kernel: NX (Execute Disable) protection: active
Feb 13 15:59:47.745291 kernel: APIC: Static calls initialized
Feb 13 15:59:47.745296 kernel: SMBIOS 2.7 present.
Feb 13 15:59:47.745301 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Feb 13 15:59:47.745306 kernel: vmware: hypercall mode: 0x00
Feb 13 15:59:47.745311 kernel: Hypervisor detected: VMware
Feb 13 15:59:47.745316 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Feb 13 15:59:47.745322 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Feb 13 15:59:47.745327 kernel: vmware: using clock offset of 3157988062 ns
Feb 13 15:59:47.745332 kernel: tsc: Detected 3408.000 MHz processor
Feb 13 15:59:47.745337 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 15:59:47.745342 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 15:59:47.745348 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Feb 13 15:59:47.745352 kernel: total RAM covered: 3072M
Feb 13 15:59:47.745357 kernel: Found optimal setting for mtrr clean up
Feb 13 15:59:47.745363 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Feb 13 15:59:47.745368 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Feb 13 15:59:47.745375 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 15:59:47.745379 kernel: Using GB pages for direct mapping
Feb 13 15:59:47.745384 kernel: ACPI: Early table checksum verification disabled
Feb 13 15:59:47.745389 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Feb 13 15:59:47.745394 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Feb 13 15:59:47.745399 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Feb 13 15:59:47.745404 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Feb 13 15:59:47.745409 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 15:59:47.745417 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 15:59:47.745422 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Feb 13 15:59:47.745428 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Feb 13 15:59:47.745433 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Feb 13 15:59:47.745438 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Feb 13 15:59:47.745444 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Feb 13 15:59:47.745450 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Feb 13 15:59:47.745455 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Feb 13 15:59:47.745461 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Feb 13 15:59:47.745466 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 15:59:47.745471 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 15:59:47.745476 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Feb 13 15:59:47.745482 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Feb 13 15:59:47.745487 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Feb 13 15:59:47.745492 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Feb 13 15:59:47.745498 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Feb 13 15:59:47.745503 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Feb 13 15:59:47.745508 kernel: system APIC only can use physical flat
Feb 13 15:59:47.745514 kernel: APIC: Switched APIC routing to: physical flat
Feb 13 15:59:47.745519 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 15:59:47.746315 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Feb 13 15:59:47.746325 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Feb 13 15:59:47.746330 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Feb 13 15:59:47.746335 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Feb 13 15:59:47.746340 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Feb 13 15:59:47.746349 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Feb 13 15:59:47.746354 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Feb 13 15:59:47.746359 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Feb 13 15:59:47.746365 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Feb 13 15:59:47.746372 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Feb 13 15:59:47.746377 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Feb 13 15:59:47.746382 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Feb 13 15:59:47.746388 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Feb 13 15:59:47.746394 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Feb 13 15:59:47.746399 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Feb 13 15:59:47.746405 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Feb 13 15:59:47.746410 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Feb 13 15:59:47.746416 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Feb 13 15:59:47.746421 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Feb 13 15:59:47.746426 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Feb 13 15:59:47.746431 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Feb 13 15:59:47.746436 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Feb 13 15:59:47.746441 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Feb 13 15:59:47.746446 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Feb 13 15:59:47.746452 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Feb 13 15:59:47.746458 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Feb 13 15:59:47.746463 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Feb 13 15:59:47.746468 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Feb 13 15:59:47.746473 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Feb 13 15:59:47.746479 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Feb 13 15:59:47.746484 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Feb 13 15:59:47.746489 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Feb 13 15:59:47.746495 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Feb 13 15:59:47.746500 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Feb 13 15:59:47.746505 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Feb 13 15:59:47.746511 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Feb 13 15:59:47.746518 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Feb 13 15:59:47.746523 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Feb 13 15:59:47.746528 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Feb 13 15:59:47.746534 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Feb 13 15:59:47.746539 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Feb 13 15:59:47.746544 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Feb 13 15:59:47.746550 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Feb 13 15:59:47.746555 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Feb 13 15:59:47.746561 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Feb 13 15:59:47.746566 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Feb 13 15:59:47.746572 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Feb 13 15:59:47.746579 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Feb 13 15:59:47.746584 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Feb 13 15:59:47.746589 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Feb 13 15:59:47.746595 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Feb 13 15:59:47.746600 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Feb 13 15:59:47.746606 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Feb 13 15:59:47.746611 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Feb 13 15:59:47.746616 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Feb 13 15:59:47.746621 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Feb 13 15:59:47.746627 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Feb 13 15:59:47.746635 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Feb 13 15:59:47.746645 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Feb 13 15:59:47.746652 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Feb 13 15:59:47.746658 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Feb 13 15:59:47.746665 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Feb 13 15:59:47.746674 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Feb 13 15:59:47.746681 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Feb 13 15:59:47.746686 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Feb 13 15:59:47.746694 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Feb 13 15:59:47.746701 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Feb 13 15:59:47.746707 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Feb 13 15:59:47.746714 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Feb 13 15:59:47.746720 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Feb 13 15:59:47.746727 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Feb 13 15:59:47.746733 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Feb 13 15:59:47.746740 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Feb 13 15:59:47.746746 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Feb 13 15:59:47.746752 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Feb 13 15:59:47.746759 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Feb 13 15:59:47.746765 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Feb 13 15:59:47.746772 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Feb 13 15:59:47.746778 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Feb 13 15:59:47.746784 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Feb 13 15:59:47.746789 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Feb 13 15:59:47.746795 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Feb 13 15:59:47.746802 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Feb 13 15:59:47.746807 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Feb 13 15:59:47.746813 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Feb 13 15:59:47.746821 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Feb 13 15:59:47.746827 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Feb 13 15:59:47.746832 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Feb 13 15:59:47.746838 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Feb 13 15:59:47.746846 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Feb 13 15:59:47.746852 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Feb 13 15:59:47.746859 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Feb 13 15:59:47.746864 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Feb 13 15:59:47.746870 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Feb 13 15:59:47.746876 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Feb 13 15:59:47.746883 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Feb 13 15:59:47.746890 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Feb 13 15:59:47.746896 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Feb 13 15:59:47.746902 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Feb 13 15:59:47.746909 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Feb 13 15:59:47.746915 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Feb 13 15:59:47.746921 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Feb 13 15:59:47.746927 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Feb 13 15:59:47.746932 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Feb 13 15:59:47.746939 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Feb 13 15:59:47.746946 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Feb 13 15:59:47.746953 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Feb 13 15:59:47.746960 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Feb 13 15:59:47.746967 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Feb 13 15:59:47.746972 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Feb 13 15:59:47.746978 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Feb 13 15:59:47.746984 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Feb 13 15:59:47.746990 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Feb 13 15:59:47.746997 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Feb 13 15:59:47.747003 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Feb 13 15:59:47.747010 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Feb 13 15:59:47.747018 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Feb 13 15:59:47.747024 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Feb 13 15:59:47.747031 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Feb 13 15:59:47.747036 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Feb 13 15:59:47.747042 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Feb 13 15:59:47.747047 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Feb 13 15:59:47.747053 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Feb 13 15:59:47.747060 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Feb 13 15:59:47.747065 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Feb 13 15:59:47.747071 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Feb 13 15:59:47.747076 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Feb 13 15:59:47.747083 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 15:59:47.747090 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 15:59:47.747096 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Feb 13 15:59:47.747102 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Feb 13 15:59:47.747109 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Feb 13 15:59:47.747115 kernel: Zone ranges:
Feb 13 15:59:47.747121 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 15:59:47.747127 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Feb 13 15:59:47.747132 kernel: Normal empty
Feb 13 15:59:47.747141 kernel: Movable zone start for each node
Feb 13 15:59:47.747147 kernel: Early memory node ranges
Feb 13 15:59:47.747153 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Feb 13 15:59:47.747159 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Feb 13 15:59:47.747164 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Feb 13 15:59:47.747171 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Feb 13 15:59:47.747177 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 15:59:47.747183 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Feb 13 15:59:47.747189 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Feb 13 15:59:47.747197 kernel: ACPI: PM-Timer IO Port: 0x1008
Feb 13 15:59:47.747204 kernel: system APIC only can use physical flat
Feb 13 15:59:47.747209 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Feb 13 15:59:47.747216 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 15:59:47.747222 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 15:59:47.747227 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 15:59:47.747233 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 15:59:47.747238 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 15:59:47.747244 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 15:59:47.747256 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 15:59:47.747264 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 15:59:47.747270 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 15:59:47.747277 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 15:59:47.747282 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 15:59:47.747288 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 15:59:47.747293 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 15:59:47.747299 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 15:59:47.747304 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 15:59:47.747311 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 15:59:47.747316 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Feb 13 15:59:47.747324 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Feb 13 15:59:47.747329 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Feb 13 15:59:47.747335 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Feb 13 15:59:47.747341 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Feb 13 15:59:47.747346 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Feb 13 15:59:47.747353 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Feb 13 15:59:47.747358 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Feb 13 15:59:47.747364 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Feb 13 15:59:47.747370 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Feb 13 15:59:47.747376 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Feb 13 15:59:47.747382 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Feb 13 15:59:47.747387 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Feb 13 15:59:47.747393 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Feb 13 15:59:47.747398 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Feb 13 15:59:47.747404 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Feb 13 15:59:47.747409 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Feb 13 15:59:47.747415 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Feb 13 15:59:47.747421 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Feb 13 15:59:47.747427 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Feb 13 15:59:47.747434 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Feb 13 15:59:47.747439 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Feb 13 15:59:47.747445 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Feb 13 15:59:47.747451 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Feb 13 15:59:47.747457 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Feb 13 15:59:47.747463 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Feb 13 15:59:47.747469 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Feb 13 15:59:47.747474 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Feb 13 15:59:47.747481 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Feb 13 15:59:47.747488 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Feb 13 15:59:47.747495 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Feb 13 15:59:47.747501 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Feb 13 15:59:47.747508 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Feb 13 15:59:47.747513 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Feb 13 15:59:47.747520 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Feb 13 15:59:47.747527 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Feb 13 15:59:47.747532 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Feb 13 15:59:47.747538 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Feb 13 15:59:47.747545 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Feb 13 15:59:47.747551 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Feb 13 15:59:47.747558 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Feb 13 15:59:47.747564 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Feb 13 15:59:47.747569 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Feb 13 15:59:47.747575 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Feb 13 15:59:47.747581 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Feb 13 15:59:47.747587 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Feb 13 15:59:47.747592 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Feb 13 15:59:47.747599 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Feb 13 15:59:47.747605 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Feb 13 15:59:47.747612 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Feb 13 15:59:47.747617 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Feb 13 15:59:47.747624 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Feb 13 15:59:47.747630 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Feb 13 15:59:47.747638 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Feb 13 15:59:47.747644 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Feb 13 15:59:47.747650 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Feb 13 15:59:47.747655 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Feb 13 15:59:47.747660 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Feb 13 15:59:47.747667 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Feb 13 15:59:47.747674 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Feb 13 15:59:47.747681 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Feb 13 15:59:47.747687 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Feb 13 15:59:47.747693 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Feb 13 15:59:47.747699 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Feb 13 15:59:47.747704 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Feb 13 15:59:47.747710 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Feb 13 15:59:47.747716 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Feb 13 15:59:47.747722 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Feb 13 15:59:47.747728 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Feb 13 15:59:47.747735 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Feb 13 15:59:47.747740 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Feb 13 15:59:47.747746 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Feb 13 15:59:47.747752 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Feb 13 15:59:47.747759 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Feb 13 15:59:47.747764 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Feb 13 15:59:47.747771 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Feb 13 15:59:47.747776 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Feb 13 15:59:47.747782 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Feb 13 15:59:47.747787 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Feb 13 15:59:47.747794 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Feb 13 15:59:47.747800 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Feb 13 15:59:47.747806 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Feb 13 15:59:47.747812 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Feb 13 15:59:47.747817 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Feb 13 15:59:47.747823 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Feb 13 15:59:47.747828 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Feb 13 15:59:47.747834 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Feb 13 15:59:47.747840 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Feb 13 15:59:47.747845 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Feb 13 15:59:47.747852 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Feb 13 15:59:47.747857 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Feb 13 15:59:47.747863 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Feb 13 15:59:47.747868 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Feb 13 15:59:47.747874 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Feb 13 15:59:47.747879 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Feb 13 15:59:47.747885 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Feb 13 15:59:47.747891 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Feb 13 15:59:47.747902 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Feb 13 15:59:47.747909 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Feb 13 15:59:47.747915 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Feb 13 15:59:47.747921 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Feb 13 15:59:47.747926 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Feb 13 15:59:47.747932 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Feb 13 15:59:47.747937 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Feb 13 15:59:47.747944 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Feb 13 15:59:47.747950 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Feb 13 15:59:47.747955 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Feb 13 15:59:47.747961 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Feb 13 15:59:47.747968 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Feb 13 15:59:47.747974 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Feb 13 15:59:47.747979 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Feb 13 15:59:47.747986 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Feb 13 15:59:47.747992 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Feb 13 15:59:47.747998 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 15:59:47.748004 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Feb 13 15:59:47.748011 kernel: TSC deadline timer available
Feb 13 15:59:47.748017 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Feb 13 15:59:47.748026 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Feb 13 15:59:47.748035 kernel: Booting paravirtualized kernel on VMware hypervisor
Feb 13 15:59:47.748041 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 15:59:47.748047 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Feb 13 15:59:47.748053 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Feb 13 15:59:47.748060 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Feb 13 15:59:47.748066 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Feb 13 15:59:47.748072 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Feb 13 15:59:47.748077 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Feb 13 15:59:47.748085 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Feb 13 15:59:47.748091 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Feb 13 15:59:47.748105 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Feb 13 15:59:47.748112 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Feb 13 15:59:47.748118 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Feb 13 15:59:47.748124 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Feb 13 15:59:47.748131 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Feb 13 15:59:47.748137 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Feb 13 15:59:47.748142 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Feb 13 15:59:47.748150 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Feb 13 15:59:47.748155 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Feb 13 15:59:47.748162 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Feb 13 15:59:47.748169 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Feb 13 15:59:47.748175 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65
Feb 13 15:59:47.748182 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 15:59:47.748188 kernel: random: crng init done
Feb 13 15:59:47.748194 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Feb 13 15:59:47.748202 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Feb 13 15:59:47.748208 kernel: printk: log_buf_len min size: 262144 bytes
Feb 13 15:59:47.748214 kernel: printk: log_buf_len: 1048576 bytes
Feb 13 15:59:47.748220 kernel: printk: early log buf free: 239648(91%)
Feb 13 15:59:47.748226 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 15:59:47.748232 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 15:59:47.748238 kernel: Fallback order for Node 0: 0
Feb 13 15:59:47.748244 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Feb 13 15:59:47.749323 kernel: Policy zone: DMA32
Feb 13 15:59:47.749343 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 15:59:47.749353 kernel: Memory: 1934308K/2096628K available (14336K kernel code, 2301K rwdata, 22852K rodata, 43476K init, 1596K bss, 162060K reserved, 0K cma-reserved)
Feb 13 15:59:47.749362 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Feb 13 15:59:47.749369 kernel: ftrace: allocating 37893 entries in 149 pages
Feb 13 15:59:47.749376 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 15:59:47.749384 kernel: Dynamic Preempt: voluntary
Feb 13 15:59:47.749390 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 15:59:47.749396 kernel: rcu: RCU event tracing is enabled.
Feb 13 15:59:47.749403 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Feb 13 15:59:47.749409 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 15:59:47.749416 kernel: Rude variant of Tasks RCU enabled.
Feb 13 15:59:47.749422 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 15:59:47.749429 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 15:59:47.749435 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Feb 13 15:59:47.749441 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Feb 13 15:59:47.749448 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Feb 13 15:59:47.749455 kernel: Console: colour VGA+ 80x25
Feb 13 15:59:47.749460 kernel: printk: console [tty0] enabled
Feb 13 15:59:47.749468 kernel: printk: console [ttyS0] enabled
Feb 13 15:59:47.749474 kernel: ACPI: Core revision 20230628
Feb 13 15:59:47.749481 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Feb 13 15:59:47.749488 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 15:59:47.749495 kernel: x2apic enabled
Feb 13 15:59:47.749502 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 15:59:47.749510 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 15:59:47.749517 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Feb 13 15:59:47.749523 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Feb 13 15:59:47.749529 kernel: Disabled fast string operations
Feb 13 15:59:47.749535 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 15:59:47.749544 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 15:59:47.749554 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 15:59:47.749564 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Feb 13 15:59:47.749573 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Feb 13 15:59:47.749586 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Feb 13 15:59:47.749596 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 15:59:47.749607 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 15:59:47.749617 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 15:59:47.749626 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 15:59:47.749637 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 15:59:47.749647 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 15:59:47.749657 kernel: SRBDS: Unknown: Dependent on hypervisor status
Feb 13 15:59:47.749667 kernel: GDS: Unknown: Dependent on hypervisor status
Feb 13 15:59:47.749681 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 15:59:47.749691 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 15:59:47.749700 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 15:59:47.749711 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 15:59:47.749720 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 13 15:59:47.749730 kernel: Freeing SMP alternatives memory: 32K
Feb 13 15:59:47.749737 kernel: pid_max: default: 131072 minimum: 1024
Feb 13 15:59:47.749747 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Feb 13 15:59:47.749756 kernel: landlock: Up and running.
Feb 13 15:59:47.749764 kernel: SELinux: Initializing.
Feb 13 15:59:47.749770 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:59:47.749777 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:59:47.749783 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 15:59:47.749789 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:59:47.749795 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:59:47.749801 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:59:47.749807 kernel: Performance Events: Skylake events, core PMU driver.
Feb 13 15:59:47.749814 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Feb 13 15:59:47.749820 kernel: core: CPUID marked event: 'instructions' unavailable
Feb 13 15:59:47.749826 kernel: core: CPUID marked event: 'bus cycles' unavailable
Feb 13 15:59:47.749832 kernel: core: CPUID marked event: 'cache references' unavailable
Feb 13 15:59:47.749838 kernel: core: CPUID marked event: 'cache misses' unavailable
Feb 13 15:59:47.749844 kernel: core: CPUID marked event: 'branch instructions' unavailable
Feb 13 15:59:47.749850 kernel: core: CPUID marked event: 'branch misses' unavailable
Feb 13 15:59:47.749857 kernel: ... version: 1
Feb 13 15:59:47.749863 kernel: ... bit width: 48
Feb 13 15:59:47.749870 kernel: ... generic registers: 4
Feb 13 15:59:47.749876 kernel: ... value mask: 0000ffffffffffff
Feb 13 15:59:47.749882 kernel: ...
max period: 000000007fffffff Feb 13 15:59:47.749888 kernel: ... fixed-purpose events: 0 Feb 13 15:59:47.749894 kernel: ... event mask: 000000000000000f Feb 13 15:59:47.749900 kernel: signal: max sigframe size: 1776 Feb 13 15:59:47.749906 kernel: rcu: Hierarchical SRCU implementation. Feb 13 15:59:47.749912 kernel: rcu: Max phase no-delay instances is 400. Feb 13 15:59:47.749918 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 15:59:47.749925 kernel: smp: Bringing up secondary CPUs ... Feb 13 15:59:47.749931 kernel: smpboot: x86: Booting SMP configuration: Feb 13 15:59:47.749937 kernel: .... node #0, CPUs: #1 Feb 13 15:59:47.749943 kernel: Disabled fast string operations Feb 13 15:59:47.749949 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Feb 13 15:59:47.749955 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 15:59:47.749960 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 15:59:47.749967 kernel: smpboot: Max logical packages: 128 Feb 13 15:59:47.749973 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Feb 13 15:59:47.749979 kernel: devtmpfs: initialized Feb 13 15:59:47.749986 kernel: x86/mm: Memory block size: 128MB Feb 13 15:59:47.749992 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Feb 13 15:59:47.749998 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 15:59:47.750005 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Feb 13 15:59:47.750011 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 15:59:47.750017 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 15:59:47.750023 kernel: audit: initializing netlink subsys (disabled) Feb 13 15:59:47.750029 kernel: audit: type=2000 audit(1739462386.067:1): state=initialized audit_enabled=0 res=1 Feb 13 15:59:47.750035 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 15:59:47.750042 
kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 15:59:47.750048 kernel: cpuidle: using governor menu Feb 13 15:59:47.750054 kernel: Simple Boot Flag at 0x36 set to 0x80 Feb 13 15:59:47.750060 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 15:59:47.750066 kernel: dca service started, version 1.12.1 Feb 13 15:59:47.750072 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Feb 13 15:59:47.750078 kernel: PCI: Using configuration type 1 for base access Feb 13 15:59:47.750083 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Feb 13 15:59:47.750089 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 15:59:47.750096 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 15:59:47.750102 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 15:59:47.750108 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 15:59:47.750115 kernel: ACPI: Added _OSI(Module Device) Feb 13 15:59:47.750121 kernel: ACPI: Added _OSI(Processor Device) Feb 13 15:59:47.750127 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 15:59:47.750133 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 15:59:47.750139 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 15:59:47.750145 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Feb 13 15:59:47.750153 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 15:59:47.750159 kernel: ACPI: Interpreter enabled Feb 13 15:59:47.750165 kernel: ACPI: PM: (supports S0 S1 S5) Feb 13 15:59:47.750171 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 15:59:47.750177 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 15:59:47.750183 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 15:59:47.750189 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Feb 13 15:59:47.750195 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Feb 13 15:59:47.750748 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 15:59:47.750816 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Feb 13 15:59:47.750872 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Feb 13 15:59:47.750882 kernel: PCI host bridge to bus 0000:00 Feb 13 15:59:47.750942 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 15:59:47.750990 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Feb 13 15:59:47.751035 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 15:59:47.751082 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 15:59:47.751129 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Feb 13 15:59:47.751175 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Feb 13 15:59:47.751238 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Feb 13 15:59:47.751324 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Feb 13 15:59:47.751386 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Feb 13 15:59:47.751447 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Feb 13 15:59:47.751502 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Feb 13 15:59:47.751557 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Feb 13 15:59:47.751610 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Feb 13 15:59:47.751661 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Feb 13 15:59:47.751713 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Feb 13 15:59:47.751769 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Feb 13 15:59:47.751824 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Feb 13 15:59:47.751876 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Feb 13 15:59:47.751935 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Feb 13 15:59:47.751988 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Feb 13 15:59:47.752041 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Feb 13 15:59:47.752100 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Feb 13 15:59:47.752156 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Feb 13 15:59:47.752208 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Feb 13 15:59:47.752276 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Feb 13 15:59:47.752330 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Feb 13 15:59:47.752382 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 15:59:47.752437 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Feb 13 15:59:47.752493 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.752548 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.752604 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.752659 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.752717 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.752770 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.752825 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.752881 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.752950 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753003 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753098 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753166 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Feb 13 15:59:47.753222 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753299 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753358 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753411 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753467 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753520 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753578 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753633 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753691 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753745 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753801 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753864 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.753924 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.753980 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754039 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754092 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754159 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754212 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754302 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754359 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754416 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754478 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754544 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754618 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754692 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754748 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754806 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754859 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.754923 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.754976 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755032 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755087 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755143 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755196 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755269 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755329 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755387 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755440 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755500 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755552 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755621 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755695 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755758 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755812 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755872 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.755924 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.755979 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Feb 13 
15:59:47.756031 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.756087 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.756140 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.756198 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:59:47.756334 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Feb 13 15:59:47.756459 kernel: pci_bus 0000:01: extended config space not accessible Feb 13 15:59:47.756573 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 15:59:47.756660 kernel: pci_bus 0000:02: extended config space not accessible Feb 13 15:59:47.756675 kernel: acpiphp: Slot [32] registered Feb 13 15:59:47.756687 kernel: acpiphp: Slot [33] registered Feb 13 15:59:47.756701 kernel: acpiphp: Slot [34] registered Feb 13 15:59:47.756712 kernel: acpiphp: Slot [35] registered Feb 13 15:59:47.756722 kernel: acpiphp: Slot [36] registered Feb 13 15:59:47.756733 kernel: acpiphp: Slot [37] registered Feb 13 15:59:47.756743 kernel: acpiphp: Slot [38] registered Feb 13 15:59:47.756753 kernel: acpiphp: Slot [39] registered Feb 13 15:59:47.756764 kernel: acpiphp: Slot [40] registered Feb 13 15:59:47.756775 kernel: acpiphp: Slot [41] registered Feb 13 15:59:47.756785 kernel: acpiphp: Slot [42] registered Feb 13 15:59:47.756796 kernel: acpiphp: Slot [43] registered Feb 13 15:59:47.756806 kernel: acpiphp: Slot [44] registered Feb 13 15:59:47.756816 kernel: acpiphp: Slot [45] registered Feb 13 15:59:47.756822 kernel: acpiphp: Slot [46] registered Feb 13 15:59:47.756828 kernel: acpiphp: Slot [47] registered Feb 13 15:59:47.756834 kernel: acpiphp: Slot [48] registered Feb 13 15:59:47.756840 kernel: acpiphp: Slot [49] registered Feb 13 15:59:47.756846 kernel: acpiphp: Slot [50] registered Feb 13 15:59:47.756851 kernel: acpiphp: Slot [51] registered Feb 13 15:59:47.756857 kernel: acpiphp: Slot [52] registered Feb 13 15:59:47.756865 kernel: acpiphp: Slot [53] registered 
Feb 13 15:59:47.756871 kernel: acpiphp: Slot [54] registered Feb 13 15:59:47.756877 kernel: acpiphp: Slot [55] registered Feb 13 15:59:47.756883 kernel: acpiphp: Slot [56] registered Feb 13 15:59:47.756888 kernel: acpiphp: Slot [57] registered Feb 13 15:59:47.756894 kernel: acpiphp: Slot [58] registered Feb 13 15:59:47.756900 kernel: acpiphp: Slot [59] registered Feb 13 15:59:47.756906 kernel: acpiphp: Slot [60] registered Feb 13 15:59:47.756915 kernel: acpiphp: Slot [61] registered Feb 13 15:59:47.756927 kernel: acpiphp: Slot [62] registered Feb 13 15:59:47.756934 kernel: acpiphp: Slot [63] registered Feb 13 15:59:47.756997 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Feb 13 15:59:47.757052 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 15:59:47.757104 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 15:59:47.757155 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:59:47.757206 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Feb 13 15:59:47.757347 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Feb 13 15:59:47.757405 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Feb 13 15:59:47.757455 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Feb 13 15:59:47.757528 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Feb 13 15:59:47.757611 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Feb 13 15:59:47.757667 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Feb 13 15:59:47.757719 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Feb 13 15:59:47.757773 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 15:59:47.757828 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 
15:59:47.757881 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Feb 13 15:59:47.757943 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 15:59:47.757996 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 15:59:47.758047 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 15:59:47.758101 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 15:59:47.758152 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 15:59:47.758204 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 15:59:47.758270 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 15:59:47.758329 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 15:59:47.758381 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 15:59:47.758433 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 15:59:47.758484 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:59:47.758538 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 15:59:47.758590 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 15:59:47.758645 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:59:47.758699 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 15:59:47.758751 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 15:59:47.759285 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:59:47.759351 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 15:59:47.759408 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 15:59:47.759460 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:59:47.759514 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 15:59:47.759568 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Feb 13 15:59:47.759620 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:59:47.759673 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 15:59:47.759726 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 15:59:47.759778 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:59:47.759849 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Feb 13 15:59:47.759905 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Feb 13 15:59:47.759963 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Feb 13 15:59:47.760017 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Feb 13 15:59:47.760070 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Feb 13 15:59:47.760140 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 15:59:47.760195 kernel: pci 0000:0b:00.0: supports D1 D2 Feb 13 15:59:47.761268 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 15:59:47.761330 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 15:59:47.761383 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 15:59:47.761436 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 15:59:47.761487 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 15:59:47.761541 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 15:59:47.761591 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 15:59:47.761643 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 15:59:47.761701 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:59:47.761770 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 15:59:47.761830 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 15:59:47.761883 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 15:59:47.761941 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:59:47.761997 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 15:59:47.762050 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 15:59:47.762106 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:59:47.762160 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 15:59:47.762212 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 15:59:47.763439 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:59:47.763506 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 15:59:47.763559 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 15:59:47.763612 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:59:47.763668 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 15:59:47.763719 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 15:59:47.763776 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:59:47.763830 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 15:59:47.763882 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 15:59:47.763933 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:59:47.763986 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 15:59:47.764038 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 15:59:47.764088 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 15:59:47.764139 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:59:47.764197 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 15:59:47.765548 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 15:59:47.765622 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 15:59:47.765675 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:59:47.765729 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 15:59:47.765782 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 15:59:47.765834 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 15:59:47.765890 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:59:47.765960 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 15:59:47.766017 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 15:59:47.766079 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:59:47.766134 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 15:59:47.766186 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 15:59:47.766236 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:59:47.766368 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 15:59:47.766424 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 15:59:47.766476 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 15:59:47.766530 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 15:59:47.766584 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 15:59:47.766637 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:59:47.766696 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 15:59:47.766750 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 15:59:47.766801 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:59:47.766856 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 15:59:47.766908 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 15:59:47.766959 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 15:59:47.767010 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:59:47.767063 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 15:59:47.767116 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 15:59:47.767167 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 15:59:47.767219 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:59:47.767288 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 15:59:47.767342 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 15:59:47.767394 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:59:47.767447 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 15:59:47.767499 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 15:59:47.767549 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:59:47.767608 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 
15:59:47.767664 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 15:59:47.767716 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 15:59:47.767771 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 15:59:47.767822 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 15:59:47.767873 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:59:47.767934 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 15:59:47.768013 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 15:59:47.768066 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:59:47.768125 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 15:59:47.768177 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 15:59:47.768228 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:59:47.768237 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Feb 13 15:59:47.768243 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Feb 13 15:59:47.768263 kernel: ACPI: PCI: Interrupt link LNKB disabled Feb 13 15:59:47.768270 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 15:59:47.768276 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Feb 13 15:59:47.768282 kernel: iommu: Default domain type: Translated Feb 13 15:59:47.768290 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 15:59:47.768296 kernel: PCI: Using ACPI for IRQ routing Feb 13 15:59:47.768302 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 15:59:47.768309 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Feb 13 15:59:47.768315 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Feb 13 15:59:47.768370 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Feb 13 15:59:47.768422 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Feb 13 15:59:47.768474 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 15:59:47.768483 kernel: vgaarb: loaded Feb 13 15:59:47.768492 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Feb 13 15:59:47.768498 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Feb 13 15:59:47.768504 kernel: clocksource: Switched to clocksource tsc-early Feb 13 15:59:47.768510 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 15:59:47.768516 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 15:59:47.768522 kernel: pnp: PnP ACPI init Feb 13 15:59:47.768576 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Feb 13 15:59:47.768634 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Feb 13 15:59:47.768685 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Feb 13 15:59:47.768735 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Feb 13 15:59:47.768787 kernel: pnp 00:06: [dma 2] Feb 13 15:59:47.768842 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Feb 13 15:59:47.768891 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Feb 13 15:59:47.768938 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Feb 13 15:59:47.768947 kernel: pnp: PnP ACPI: found 8 devices Feb 13 15:59:47.768955 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 15:59:47.768961 kernel: NET: Registered PF_INET protocol family Feb 13 15:59:47.768968 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 15:59:47.768974 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 15:59:47.768980 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 15:59:47.768986 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 15:59:47.768992 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 15:59:47.768998 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 15:59:47.769005 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 15:59:47.769011 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 15:59:47.769017 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 15:59:47.769023 kernel: NET: Registered PF_XDP protocol family Feb 13 15:59:47.769078 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Feb 13 15:59:47.769133 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 15:59:47.769188 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 15:59:47.769243 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 15:59:47.769977 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 15:59:47.770063 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Feb 13 15:59:47.770142 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Feb 13 15:59:47.770200 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Feb 13 15:59:47.770275 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Feb 13 15:59:47.770337 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Feb 13 15:59:47.770391 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Feb 13 15:59:47.770446 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Feb 13 15:59:47.770502 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Feb 13 
15:59:47.770556 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Feb 13 15:59:47.770610 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Feb 13 15:59:47.770671 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Feb 13 15:59:47.770724 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Feb 13 15:59:47.770778 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Feb 13 15:59:47.770830 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Feb 13 15:59:47.770883 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Feb 13 15:59:47.770942 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Feb 13 15:59:47.770998 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Feb 13 15:59:47.771050 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Feb 13 15:59:47.771103 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:59:47.771155 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:59:47.771207 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771322 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.771376 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771431 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.771483 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771535 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.771586 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771642 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 
13 15:59:47.771695 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771747 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.771799 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771854 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.771906 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.771957 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772009 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772061 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772114 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772166 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772218 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772280 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772334 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772385 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772437 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772488 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772541 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772593 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772649 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772705 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.772759 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.772967 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773147 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773203 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773299 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773353 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773406 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773463 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773517 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773570 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773622 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773680 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773735 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773787 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773839 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.773898 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.773952 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774004 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774055 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774107 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774159 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774211 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774324 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774377 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774430 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Feb 13 15:59:47.774486 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774537 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774588 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774639 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774696 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774748 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774799 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774851 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.774913 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.774970 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775022 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775075 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775127 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775179 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775230 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775298 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775350 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775403 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775455 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775510 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775562 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775615 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775667 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775721 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775774 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775827 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775888 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.775943 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.775998 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.776051 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 15:59:47.776104 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:59:47.776158 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 15:59:47.776211 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Feb 13 15:59:47.776344 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 15:59:47.776396 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 15:59:47.778331 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:59:47.778401 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Feb 13 15:59:47.778461 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 15:59:47.778515 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 15:59:47.778567 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 15:59:47.778619 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:59:47.778673 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 15:59:47.778725 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 15:59:47.778777 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 15:59:47.778829 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 
15:59:47.778883 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 15:59:47.778945 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 15:59:47.778997 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 15:59:47.779049 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:59:47.779102 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 15:59:47.779153 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 15:59:47.779205 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:59:47.779638 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 15:59:47.779722 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 15:59:47.779781 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:59:47.779850 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 15:59:47.779909 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 15:59:47.779962 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:59:47.780016 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 15:59:47.780104 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 15:59:47.780168 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:59:47.780228 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 15:59:47.780322 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 15:59:47.780378 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:59:47.780459 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Feb 13 15:59:47.780520 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 15:59:47.780602 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 15:59:47.780668 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Feb 13 15:59:47.780734 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:59:47.780812 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 15:59:47.780871 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 15:59:47.780935 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 15:59:47.780989 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:59:47.781044 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 15:59:47.781101 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 15:59:47.781171 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 15:59:47.781228 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:59:47.781397 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 15:59:47.781452 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 15:59:47.781505 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:59:47.781563 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 15:59:47.781621 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 15:59:47.781684 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:59:47.781740 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 15:59:47.781800 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 15:59:47.781852 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:59:47.781905 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 15:59:47.781957 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 15:59:47.782008 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:59:47.782074 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 15:59:47.782160 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 15:59:47.782230 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:59:47.782336 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 15:59:47.782421 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 15:59:47.782494 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 15:59:47.782597 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:59:47.782688 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 15:59:47.782772 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 15:59:47.782830 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 15:59:47.782896 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:59:47.782969 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 15:59:47.783026 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 15:59:47.783086 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 15:59:47.783172 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:59:47.783245 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 15:59:47.783321 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 15:59:47.783400 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:59:47.783471 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 15:59:47.783533 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 15:59:47.783613 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:59:47.783700 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 15:59:47.783765 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 15:59:47.783818 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 
15:59:47.783872 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 15:59:47.783931 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 15:59:47.783985 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:59:47.784039 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 15:59:47.784093 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 15:59:47.784176 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:59:47.784232 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 15:59:47.784346 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 15:59:47.784428 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 15:59:47.784513 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:59:47.784600 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 15:59:47.784666 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 15:59:47.784720 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 15:59:47.784771 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:59:47.784833 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 15:59:47.784901 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 15:59:47.784975 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:59:47.785031 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 15:59:47.785087 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 15:59:47.785165 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:59:47.785237 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 15:59:47.785300 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 15:59:47.785353 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Feb 13 15:59:47.785415 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 15:59:47.785494 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 15:59:47.785565 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:59:47.785632 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 15:59:47.785708 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 15:59:47.785793 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:59:47.785875 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 15:59:47.785952 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 15:59:47.786022 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:59:47.786076 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 15:59:47.786129 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 15:59:47.786175 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 15:59:47.786220 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Feb 13 15:59:47.786280 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Feb 13 15:59:47.786333 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Feb 13 15:59:47.786382 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Feb 13 15:59:47.786430 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:59:47.786481 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 15:59:47.786529 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 15:59:47.786577 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 15:59:47.786624 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Feb 13 15:59:47.786672 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Feb 13 15:59:47.786725 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Feb 13 15:59:47.786774 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Feb 13 15:59:47.786821 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:59:47.786879 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Feb 13 15:59:47.786946 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Feb 13 15:59:47.787006 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 15:59:47.787060 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Feb 13 15:59:47.787122 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Feb 13 15:59:47.787181 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:59:47.787236 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Feb 13 15:59:47.787309 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:59:47.787364 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Feb 13 15:59:47.787412 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:59:47.787464 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Feb 13 15:59:47.787512 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:59:47.787563 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Feb 13 15:59:47.787615 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:59:47.787670 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Feb 13 15:59:47.787726 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:59:47.787781 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Feb 13 15:59:47.787831 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Feb 13 15:59:47.787881 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:59:47.787940 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Feb 13 15:59:47.787989 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Feb 13 15:59:47.788036 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:59:47.788088 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Feb 13 15:59:47.788136 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Feb 13 15:59:47.788183 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:59:47.788238 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Feb 13 15:59:47.788352 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:59:47.788405 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Feb 13 15:59:47.788453 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:59:47.788504 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Feb 13 15:59:47.788551 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:59:47.788606 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Feb 13 15:59:47.788655 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:59:47.788706 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Feb 13 15:59:47.788753 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:59:47.788805 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Feb 13 15:59:47.788855 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Feb 13 15:59:47.788913 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:59:47.788978 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Feb 13 15:59:47.789026 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Feb 13 15:59:47.789074 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:59:47.789128 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Feb 13 15:59:47.789176 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Feb 13 15:59:47.789222 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:59:47.789286 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Feb 13 15:59:47.789346 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:59:47.789398 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Feb 13 15:59:47.789460 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:59:47.789517 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Feb 13 15:59:47.789573 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 15:59:47.789636 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Feb 13 15:59:47.789697 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:59:47.789751 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Feb 13 15:59:47.789810 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:59:47.789875 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Feb 13 15:59:47.789929 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Feb 13 15:59:47.789997 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:59:47.790062 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Feb 13 15:59:47.790112 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Feb 13 15:59:47.790159 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:59:47.790213 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Feb 13 15:59:47.790445 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:59:47.790500 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Feb 13 15:59:47.790558 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:59:47.790611 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Feb 13 15:59:47.790658 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 15:59:47.790710 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Feb 13 15:59:47.790758 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:59:47.790809 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Feb 13 15:59:47.790860 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:59:47.790919 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Feb 13 15:59:47.790966 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:59:47.791023 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 15:59:47.791034 kernel: PCI: CLS 32 bytes, default 64 Feb 13 15:59:47.791041 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 15:59:47.791048 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 15:59:47.791057 kernel: clocksource: Switched to clocksource tsc Feb 13 15:59:47.791063 kernel: Initialise system trusted keyrings Feb 13 15:59:47.791070 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 15:59:47.791076 kernel: Key type asymmetric registered Feb 13 15:59:47.791082 kernel: Asymmetric key parser 'x509' registered Feb 13 15:59:47.791089 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 15:59:47.791095 kernel: io scheduler mq-deadline registered Feb 13 15:59:47.791101 kernel: io scheduler kyber registered Feb 13 15:59:47.791108 kernel: io scheduler bfq registered Feb 13 15:59:47.791164 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Feb 13 15:59:47.791219 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791282 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Feb 13 15:59:47.791336 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791391 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Feb 13 15:59:47.791457 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791513 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Feb 13 15:59:47.791570 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791625 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Feb 13 15:59:47.791678 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791731 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Feb 13 15:59:47.791786 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791844 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Feb 13 15:59:47.791900 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.791964 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Feb 13 15:59:47.792017 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792072 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Feb 13 15:59:47.792126 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792213 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Feb 13 15:59:47.792303 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792359 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Feb 13 15:59:47.792411 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792465 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Feb 13 15:59:47.792517 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792572 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Feb 13 15:59:47.792627 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792682 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Feb 13 15:59:47.792735 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792789 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Feb 13 15:59:47.792842 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.792900 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Feb 13 15:59:47.792955 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793009 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Feb 13 15:59:47.793062 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793115 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Feb 13 15:59:47.793172 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793240 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Feb 13 15:59:47.793318 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793383 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Feb 13 15:59:47.793449 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793510 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Feb 13 15:59:47.793571 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793638 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Feb 13 15:59:47.793702 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793764 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Feb 13 15:59:47.793821 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.793885 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Feb 13 15:59:47.793955 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794034 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Feb 13 15:59:47.794110 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794188 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Feb 13 15:59:47.794288 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794374 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Feb 13 15:59:47.794456 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794535 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Feb 13 15:59:47.794606 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794672 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Feb 13 15:59:47.794736 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794796 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Feb 13 15:59:47.794865 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.794951 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Feb 13 15:59:47.795018 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.795083 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Feb 13 15:59:47.795146 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:59:47.795157 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Feb 13 15:59:47.795167 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 15:59:47.795174 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 15:59:47.795180 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 15:59:47.795187 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 15:59:47.795193 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 15:59:47.795611 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 15:59:47.795676 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T15:59:47 UTC (1739462387) Feb 13 15:59:47.795737 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 15:59:47.795751 kernel: intel_pstate: CPU model not supported Feb 13 15:59:47.795760 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 15:59:47.795771 kernel: NET: Registered PF_INET6 protocol family Feb 13 15:59:47.795780 kernel: Segment Routing with IPv6 Feb 13 15:59:47.795787 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 15:59:47.795793 kernel: NET: Registered PF_PACKET protocol family Feb 13 15:59:47.795800 kernel: Key type dns_resolver registered Feb 13 15:59:47.795806 kernel: IPI shorthand broadcast: enabled Feb 13 15:59:47.795813 kernel: sched_clock: Marking stable (925295540, 227964875)->(1213971460, -60711045) Feb 13 15:59:47.795825 kernel: registered taskstats version 1 Feb 13 15:59:47.795832 kernel: Loading compiled-in X.509 certificates Feb 13 15:59:47.795838 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: a260c8876205efb4ca2ab3eb040cd310ec7afd21' Feb 13 15:59:47.795849 kernel: Key type .fscrypt registered Feb 13 15:59:47.795860 kernel: Key type fscrypt-provisioning registered Feb 13 15:59:47.795866 kernel: ima: No TPM chip found, activating TPM-bypass! 
Feb 13 15:59:47.795872 kernel: ima: Allocated hash algorithm: sha1 Feb 13 15:59:47.795879 kernel: ima: No architecture policies found Feb 13 15:59:47.795887 kernel: clk: Disabling unused clocks Feb 13 15:59:47.795897 kernel: Freeing unused kernel image (initmem) memory: 43476K Feb 13 15:59:47.795907 kernel: Write protecting the kernel read-only data: 38912k Feb 13 15:59:47.795914 kernel: Freeing unused kernel image (rodata/data gap) memory: 1724K Feb 13 15:59:47.795920 kernel: Run /init as init process Feb 13 15:59:47.795931 kernel: with arguments: Feb 13 15:59:47.795943 kernel: /init Feb 13 15:59:47.795949 kernel: with environment: Feb 13 15:59:47.795955 kernel: HOME=/ Feb 13 15:59:47.795962 kernel: TERM=linux Feb 13 15:59:47.795970 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 15:59:47.795977 systemd[1]: Successfully made /usr/ read-only. Feb 13 15:59:47.795985 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 15:59:47.795992 systemd[1]: Detected virtualization vmware. Feb 13 15:59:47.795999 systemd[1]: Detected architecture x86-64. Feb 13 15:59:47.796007 systemd[1]: Running in initrd. Feb 13 15:59:47.796018 systemd[1]: No hostname configured, using default hostname. Feb 13 15:59:47.796030 systemd[1]: Hostname set to . Feb 13 15:59:47.796036 systemd[1]: Initializing machine ID from random generator. Feb 13 15:59:47.796043 systemd[1]: Queued start job for default target initrd.target. Feb 13 15:59:47.796049 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 15:59:47.796056 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Feb 13 15:59:47.796063 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 15:59:47.796069 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 15:59:47.796076 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 15:59:47.796089 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 15:59:47.796102 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 15:59:47.796115 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 15:59:47.796126 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 15:59:47.796134 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 15:59:47.796141 systemd[1]: Reached target paths.target - Path Units. Feb 13 15:59:47.796147 systemd[1]: Reached target slices.target - Slice Units. Feb 13 15:59:47.796160 systemd[1]: Reached target swap.target - Swaps. Feb 13 15:59:47.796170 systemd[1]: Reached target timers.target - Timer Units. Feb 13 15:59:47.796182 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 15:59:47.796193 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 15:59:47.796204 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 15:59:47.796214 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Feb 13 15:59:47.796225 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 15:59:47.796236 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 15:59:47.796346 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Feb 13 15:59:47.796360 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 15:59:47.796367 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 15:59:47.796375 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 15:59:47.796389 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 15:59:47.796401 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 15:59:47.796412 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 15:59:47.796424 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 15:59:47.796435 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:59:47.796449 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 15:59:47.796460 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 15:59:47.796473 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 15:59:47.796508 systemd-journald[216]: Collecting audit messages is disabled. Feb 13 15:59:47.796537 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 15:59:47.796549 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 15:59:47.796561 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:59:47.796573 kernel: Bridge firewalling registered Feb 13 15:59:47.796585 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:59:47.796599 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 15:59:47.796611 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 15:59:47.796623 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Feb 13 15:59:47.796635 systemd-journald[216]: Journal started Feb 13 15:59:47.796659 systemd-journald[216]: Runtime Journal (/run/log/journal/269be8d8c9a447ba900345f4ee8489f0) is 4.8M, max 38.6M, 33.8M free. Feb 13 15:59:47.756784 systemd-modules-load[217]: Inserted module 'overlay' Feb 13 15:59:47.780118 systemd-modules-load[217]: Inserted module 'br_netfilter' Feb 13 15:59:47.806263 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 15:59:47.808273 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 15:59:47.809519 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 15:59:47.809920 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 15:59:47.811868 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Feb 13 15:59:47.814234 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 15:59:47.814496 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 15:59:47.821112 dracut-cmdline[246]: dracut-dracut-053 Feb 13 15:59:47.821695 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 15:59:47.822988 dracut-cmdline[246]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65 Feb 13 15:59:47.826375 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 15:59:47.847139 systemd-resolved[262]: Positive Trust Anchors: Feb 13 15:59:47.847419 systemd-resolved[262]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 15:59:47.847577 systemd-resolved[262]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 15:59:47.850012 systemd-resolved[262]: Defaulting to hostname 'linux'. Feb 13 15:59:47.850811 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 15:59:47.850960 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 15:59:47.868274 kernel: SCSI subsystem initialized Feb 13 15:59:47.874275 kernel: Loading iSCSI transport class v2.0-870. Feb 13 15:59:47.881345 kernel: iscsi: registered transport (tcp) Feb 13 15:59:47.894552 kernel: iscsi: registered transport (qla4xxx) Feb 13 15:59:47.894595 kernel: QLogic iSCSI HBA Driver Feb 13 15:59:47.914963 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Feb 13 15:59:47.919349 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Feb 13 15:59:47.933567 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Feb 13 15:59:47.933609 kernel: device-mapper: uevent: version 1.0.3 Feb 13 15:59:47.934635 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Feb 13 15:59:47.967277 kernel: raid6: avx2x4 gen() 46867 MB/s Feb 13 15:59:47.982267 kernel: raid6: avx2x2 gen() 52853 MB/s Feb 13 15:59:47.999479 kernel: raid6: avx2x1 gen() 44654 MB/s Feb 13 15:59:47.999509 kernel: raid6: using algorithm avx2x2 gen() 52853 MB/s Feb 13 15:59:48.017523 kernel: raid6: .... xor() 31862 MB/s, rmw enabled Feb 13 15:59:48.017547 kernel: raid6: using avx2x2 recovery algorithm Feb 13 15:59:48.030263 kernel: xor: automatically using best checksumming function avx Feb 13 15:59:48.122276 kernel: Btrfs loaded, zoned=no, fsverity=no Feb 13 15:59:48.127490 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Feb 13 15:59:48.132343 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 15:59:48.140774 systemd-udevd[435]: Using default interface naming scheme 'v255'. Feb 13 15:59:48.143921 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 15:59:48.151335 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Feb 13 15:59:48.158035 dracut-pre-trigger[437]: rd.md=0: removing MD RAID activation Feb 13 15:59:48.173831 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 15:59:48.178342 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 15:59:48.248948 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 15:59:48.251755 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Feb 13 15:59:48.266537 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Feb 13 15:59:48.267026 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Feb 13 15:59:48.267149 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 15:59:48.267238 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 15:59:48.271375 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Feb 13 15:59:48.280570 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Feb 13 15:59:48.319283 kernel: VMware PVSCSI driver - version 1.0.7.0-k Feb 13 15:59:48.326531 kernel: vmw_pvscsi: using 64bit dma Feb 13 15:59:48.326559 kernel: vmw_pvscsi: max_id: 16 Feb 13 15:59:48.326572 kernel: vmw_pvscsi: setting ring_pages to 8 Feb 13 15:59:48.331264 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Feb 13 15:59:48.333410 kernel: vmw_pvscsi: enabling reqCallThreshold Feb 13 15:59:48.333430 kernel: vmw_pvscsi: driver-based request coalescing enabled Feb 13 15:59:48.333440 kernel: vmw_pvscsi: using MSI-X Feb 13 15:59:48.333447 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Feb 13 15:59:48.338278 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Feb 13 15:59:48.345014 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Feb 13 15:59:48.349478 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Feb 13 15:59:48.349562 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Feb 13 15:59:48.355307 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Feb 13 15:59:48.356277 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 15:59:48.363234 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 15:59:48.363487 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 15:59:48.363799 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:59:48.363919 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. 
Feb 13 15:59:48.363990 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:59:48.364520 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:59:48.367286 kernel: libata version 3.00 loaded. Feb 13 15:59:48.370403 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:59:48.371263 kernel: ata_piix 0000:00:07.1: version 2.13 Feb 13 15:59:48.376224 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 15:59:48.376235 kernel: AES CTR mode by8 optimization enabled Feb 13 15:59:48.376243 kernel: scsi host1: ata_piix Feb 13 15:59:48.376347 kernel: scsi host2: ata_piix Feb 13 15:59:48.376410 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Feb 13 15:59:48.376418 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Feb 13 15:59:48.386376 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:59:48.390368 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:59:48.399745 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Feb 13 15:59:48.543276 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Feb 13 15:59:48.549299 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Feb 13 15:59:48.561344 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Feb 13 15:59:48.570427 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 15:59:48.570515 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Feb 13 15:59:48.570579 kernel: sd 0:0:0:0: [sda] Cache data unavailable Feb 13 15:59:48.570639 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Feb 13 15:59:48.570698 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:59:48.570707 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Feb 13 15:59:48.587192 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 15:59:48.587459 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Feb 13 15:59:48.587470 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Feb 13 15:59:48.629381 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (497) Feb 13 15:59:48.636384 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Feb 13 15:59:48.639261 kernel: BTRFS: device fsid 506754f7-5ef1-4c63-ad2a-b7b855a48f85 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (493) Feb 13 15:59:48.641980 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 15:59:48.647702 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Feb 13 15:59:48.652312 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Feb 13 15:59:48.652463 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Feb 13 15:59:48.656351 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Feb 13 15:59:48.680518 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:59:48.686285 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:59:49.750265 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:59:49.750568 disk-uuid[595]: The operation has completed successfully. Feb 13 15:59:50.039865 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 15:59:50.039941 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 15:59:50.051477 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 15:59:50.055978 sh[611]: Success Feb 13 15:59:50.071422 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 15:59:50.120731 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Feb 13 15:59:50.126332 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Feb 13 15:59:50.126696 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 15:59:50.149211 kernel: BTRFS info (device dm-0): first mount of filesystem 506754f7-5ef1-4c63-ad2a-b7b855a48f85 Feb 13 15:59:50.149293 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:59:50.149314 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 15:59:50.151295 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 15:59:50.151333 kernel: BTRFS info (device dm-0): using free space tree Feb 13 15:59:50.160449 kernel: BTRFS info (device dm-0): enabling ssd optimizations Feb 13 15:59:50.162937 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 15:59:50.170763 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Feb 13 15:59:50.172233 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Feb 13 15:59:50.210207 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:59:50.210267 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:59:50.210278 kernel: BTRFS info (device sda6): using free space tree Feb 13 15:59:50.217265 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 15:59:50.224056 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 15:59:50.226290 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:59:50.234788 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 15:59:50.243459 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 15:59:50.311651 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 15:59:50.315370 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Feb 13 15:59:50.370440 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 15:59:50.380389 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
Feb 13 15:59:50.392576 ignition[673]: Ignition 2.20.0 Feb 13 15:59:50.392583 ignition[673]: Stage: fetch-offline Feb 13 15:59:50.392607 ignition[673]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:59:50.392612 ignition[673]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:59:50.392666 ignition[673]: parsed url from cmdline: "" Feb 13 15:59:50.392668 ignition[673]: no config URL provided Feb 13 15:59:50.392671 ignition[673]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 15:59:50.392675 ignition[673]: no config at "/usr/lib/ignition/user.ign" Feb 13 15:59:50.393042 ignition[673]: config successfully fetched Feb 13 15:59:50.393058 ignition[673]: parsing config with SHA512: ce463a77c3730a0770ec8629d5a8431e1b592597e8b5c86694d9f1903f98e3e226734413dbe11e370e54b1d88dd16d5a83f2106b666c7400f195355336214e6b Feb 13 15:59:50.395601 unknown[673]: fetched base config from "system" Feb 13 15:59:50.395609 unknown[673]: fetched user config from "vmware" Feb 13 15:59:50.396399 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 15:59:50.395821 ignition[673]: fetch-offline: fetch-offline passed Feb 13 15:59:50.395863 ignition[673]: Ignition finished successfully Feb 13 15:59:50.400414 systemd-networkd[805]: lo: Link UP Feb 13 15:59:50.400581 systemd-networkd[805]: lo: Gained carrier Feb 13 15:59:50.401523 systemd-networkd[805]: Enumeration completed Feb 13 15:59:50.401685 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 15:59:50.401909 systemd-networkd[805]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Feb 13 15:59:50.402032 systemd[1]: Reached target network.target - Network. Feb 13 15:59:50.402153 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Feb 13 15:59:50.405456 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 15:59:50.405584 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 15:59:50.406449 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 15:59:50.406769 systemd-networkd[805]: ens192: Link UP Feb 13 15:59:50.406773 systemd-networkd[805]: ens192: Gained carrier Feb 13 15:59:50.415857 ignition[809]: Ignition 2.20.0 Feb 13 15:59:50.415866 ignition[809]: Stage: kargs Feb 13 15:59:50.415994 ignition[809]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:59:50.416000 ignition[809]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:59:50.416604 ignition[809]: kargs: kargs passed Feb 13 15:59:50.416651 ignition[809]: Ignition finished successfully Feb 13 15:59:50.417568 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Feb 13 15:59:50.425374 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Feb 13 15:59:50.431955 ignition[816]: Ignition 2.20.0 Feb 13 15:59:50.431963 ignition[816]: Stage: disks Feb 13 15:59:50.432067 ignition[816]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:59:50.432073 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:59:50.432585 ignition[816]: disks: disks passed Feb 13 15:59:50.432611 ignition[816]: Ignition finished successfully Feb 13 15:59:50.433419 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 15:59:50.433842 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 15:59:50.433979 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Feb 13 15:59:50.434159 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 15:59:50.434348 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 15:59:50.434530 systemd[1]: Reached target basic.target - Basic System. 
Feb 13 15:59:50.442446 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 15:59:50.663836 systemd-resolved[262]: Detected conflict on linux IN A 139.178.70.107 Feb 13 15:59:50.663846 systemd-resolved[262]: Hostname conflict, changing published hostname from 'linux' to 'linux2'. Feb 13 15:59:50.721439 systemd-fsck[824]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Feb 13 15:59:50.730353 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 15:59:51.155360 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 15:59:51.307220 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 15:59:51.307492 kernel: EXT4-fs (sda9): mounted filesystem 8023eced-1511-4e72-a58a-db1b8cb3210e r/w with ordered data mode. Quota mode: none. Feb 13 15:59:51.307713 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 15:59:51.314420 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 15:59:51.317770 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Feb 13 15:59:51.318231 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 15:59:51.318293 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 15:59:51.318319 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 15:59:51.323138 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 15:59:51.324541 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Feb 13 15:59:51.346277 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (832) Feb 13 15:59:51.348272 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:59:51.350353 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:59:51.350423 kernel: BTRFS info (device sda6): using free space tree Feb 13 15:59:51.357283 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 15:59:51.359628 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 15:59:51.415896 initrd-setup-root[856]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 15:59:51.419947 initrd-setup-root[863]: cut: /sysroot/etc/group: No such file or directory Feb 13 15:59:51.423683 initrd-setup-root[870]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 15:59:51.431329 initrd-setup-root[877]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 15:59:51.768449 systemd-networkd[805]: ens192: Gained IPv6LL Feb 13 15:59:51.779943 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 15:59:51.785376 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 15:59:51.787349 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 15:59:51.793280 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:59:51.813824 ignition[950]: INFO : Ignition 2.20.0 Feb 13 15:59:51.813824 ignition[950]: INFO : Stage: mount Feb 13 15:59:51.815140 ignition[950]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 15:59:51.815140 ignition[950]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:59:51.815140 ignition[950]: INFO : mount: mount passed Feb 13 15:59:51.815140 ignition[950]: INFO : Ignition finished successfully Feb 13 15:59:51.816820 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Feb 13 15:59:51.820401 systemd[1]: Starting ignition-files.service - Ignition (files)...
Feb 13 15:59:51.823863 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Feb 13 15:59:52.147050 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Feb 13 15:59:52.151392 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 15:59:52.208312 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (962)
Feb 13 15:59:52.212885 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773
Feb 13 15:59:52.212928 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 15:59:52.212940 kernel: BTRFS info (device sda6): using free space tree
Feb 13 15:59:52.217268 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 15:59:52.217961 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 15:59:52.234910 ignition[979]: INFO : Ignition 2.20.0
Feb 13 15:59:52.235981 ignition[979]: INFO : Stage: files
Feb 13 15:59:52.235981 ignition[979]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 15:59:52.235981 ignition[979]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 15:59:52.235981 ignition[979]: DEBUG : files: compiled without relabeling support, skipping
Feb 13 15:59:52.244911 ignition[979]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 13 15:59:52.245095 ignition[979]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 13 15:59:52.246663 ignition[979]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 13 15:59:52.246928 ignition[979]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 13 15:59:52.247216 unknown[979]: wrote ssh authorized keys file for user: core
Feb 13 15:59:52.247689 ignition[979]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 13 15:59:52.249643 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 15:59:52.249643 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Feb 13 15:59:52.292480 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Feb 13 15:59:52.435683 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 15:59:52.435683 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Feb 13 15:59:52.436297 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Feb 13 15:59:52.439002 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-x86-64.raw: attempt #1
Feb 13 15:59:52.909539 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Feb 13 15:59:53.107235 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-x86-64.raw"
Feb 13 15:59:53.107235 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Feb 13 15:59:53.107736 ignition[979]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Feb 13 15:59:53.688584 ignition[979]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Feb 13 15:59:53.692179 ignition[979]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Feb 13 15:59:53.692433 ignition[979]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Feb 13 15:59:53.692433 ignition[979]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Feb 13 15:59:53.692821 ignition[979]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Feb 13 15:59:53.692821 ignition[979]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 15:59:53.692821 ignition[979]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 15:59:53.692821 ignition[979]: INFO : files: files passed
Feb 13 15:59:53.692821 ignition[979]: INFO : Ignition finished successfully
Feb 13 15:59:53.694590 systemd[1]: Finished ignition-files.service - Ignition (files).
Feb 13 15:59:53.698437 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Feb 13 15:59:53.699469 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Feb 13 15:59:53.705786 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 15:59:53.706022 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Feb 13 15:59:53.711440 initrd-setup-root-after-ignition[1010]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:59:53.711440 initrd-setup-root-after-ignition[1010]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:59:53.712209 initrd-setup-root-after-ignition[1014]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 15:59:53.713216 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 15:59:53.713753 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Feb 13 15:59:53.717392 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Feb 13 15:59:53.738229 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 15:59:53.738312 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Feb 13 15:59:53.738670 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Feb 13 15:59:53.738800 systemd[1]: Reached target initrd.target - Initrd Default Target.
Feb 13 15:59:53.739029 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Feb 13 15:59:53.739671 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Feb 13 15:59:53.755865 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 15:59:53.758361 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Feb 13 15:59:53.765440 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Feb 13 15:59:53.765621 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 15:59:53.765907 systemd[1]: Stopped target timers.target - Timer Units.
Feb 13 15:59:53.766104 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 15:59:53.766178 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 15:59:53.766541 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Feb 13 15:59:53.766701 systemd[1]: Stopped target basic.target - Basic System.
Feb 13 15:59:53.766901 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Feb 13 15:59:53.767085 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 15:59:53.767445 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Feb 13 15:59:53.767653 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Feb 13 15:59:53.767846 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 15:59:53.768057 systemd[1]: Stopped target sysinit.target - System Initialization.
Feb 13 15:59:53.768268 systemd[1]: Stopped target local-fs.target - Local File Systems.
Feb 13 15:59:53.768456 systemd[1]: Stopped target swap.target - Swaps.
Feb 13 15:59:53.768614 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 13 15:59:53.768681 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 15:59:53.768941 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Feb 13 15:59:53.769183 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 15:59:53.769374 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Feb 13 15:59:53.769419 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 15:59:53.769571 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 13 15:59:53.769634 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Feb 13 15:59:53.769897 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Feb 13 15:59:53.769963 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 15:59:53.770185 systemd[1]: Stopped target paths.target - Path Units.
Feb 13 15:59:53.770330 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 13 15:59:53.775303 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 15:59:53.775470 systemd[1]: Stopped target slices.target - Slice Units.
Feb 13 15:59:53.775666 systemd[1]: Stopped target sockets.target - Socket Units.
Feb 13 15:59:53.775854 systemd[1]: iscsid.socket: Deactivated successfully.
Feb 13 15:59:53.775919 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 15:59:53.776114 systemd[1]: iscsiuio.socket: Deactivated successfully.
Feb 13 15:59:53.776157 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 15:59:53.776392 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Feb 13 15:59:53.776455 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 15:59:53.776682 systemd[1]: ignition-files.service: Deactivated successfully.
Feb 13 15:59:53.776739 systemd[1]: Stopped ignition-files.service - Ignition (files).
Feb 13 15:59:53.784353 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Feb 13 15:59:53.786892 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Feb 13 15:59:53.787016 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 13 15:59:53.787109 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 15:59:53.787398 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 13 15:59:53.787481 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 15:59:53.789996 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 13 15:59:53.790057 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Feb 13 15:59:53.793654 ignition[1035]: INFO : Ignition 2.20.0
Feb 13 15:59:53.793932 ignition[1035]: INFO : Stage: umount
Feb 13 15:59:53.794103 ignition[1035]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 15:59:53.794103 ignition[1035]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 15:59:53.795094 ignition[1035]: INFO : umount: umount passed
Feb 13 15:59:53.795237 ignition[1035]: INFO : Ignition finished successfully
Feb 13 15:59:53.795782 systemd[1]: ignition-mount.service: Deactivated successfully.
Feb 13 15:59:53.795835 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Feb 13 15:59:53.796470 systemd[1]: Stopped target network.target - Network.
Feb 13 15:59:53.796572 systemd[1]: ignition-disks.service: Deactivated successfully.
Feb 13 15:59:53.796605 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Feb 13 15:59:53.796720 systemd[1]: ignition-kargs.service: Deactivated successfully.
Feb 13 15:59:53.796743 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Feb 13 15:59:53.796861 systemd[1]: ignition-setup.service: Deactivated successfully.
Feb 13 15:59:53.796884 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Feb 13 15:59:53.797072 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Feb 13 15:59:53.797093 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Feb 13 15:59:53.797450 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Feb 13 15:59:53.797671 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Feb 13 15:59:53.799040 systemd[1]: systemd-resolved.service: Deactivated successfully.
Feb 13 15:59:53.799105 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Feb 13 15:59:53.800739 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Feb 13 15:59:53.800886 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 13 15:59:53.800911 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 15:59:53.801777 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 13 15:59:53.804032 systemd[1]: systemd-networkd.service: Deactivated successfully.
Feb 13 15:59:53.804102 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Feb 13 15:59:53.804917 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Feb 13 15:59:53.805012 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Feb 13 15:59:53.805030 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 15:59:53.808329 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Feb 13 15:59:53.808446 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Feb 13 15:59:53.808475 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 15:59:53.808817 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Feb 13 15:59:53.808842 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Feb 13 15:59:53.809609 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 13 15:59:53.809634 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Feb 13 15:59:53.809841 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 13 15:59:53.809864 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Feb 13 15:59:53.810006 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 15:59:53.811912 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 13 15:59:53.816742 systemd[1]: network-cleanup.service: Deactivated successfully.
Feb 13 15:59:53.816817 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Feb 13 15:59:53.821650 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 13 15:59:53.821726 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 15:59:53.822014 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 13 15:59:53.822042 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Feb 13 15:59:53.822259 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 13 15:59:53.822279 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 15:59:53.822430 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 13 15:59:53.822456 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 15:59:53.822872 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 13 15:59:53.822896 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Feb 13 15:59:53.823170 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 15:59:53.823192 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 15:59:53.827368 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Feb 13 15:59:53.827494 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 13 15:59:53.827530 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 15:59:53.828301 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Feb 13 15:59:53.828330 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 15:59:53.828455 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 13 15:59:53.828478 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 15:59:53.828596 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 15:59:53.828620 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:59:53.830068 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Feb 13 15:59:53.830104 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Feb 13 15:59:53.830780 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 13 15:59:53.830997 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Feb 13 15:59:54.190360 systemd[1]: sysroot-boot.service: Deactivated successfully.
Feb 13 15:59:54.190434 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Feb 13 15:59:54.190907 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Feb 13 15:59:54.191098 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Feb 13 15:59:54.191160 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Feb 13 15:59:54.198429 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Feb 13 15:59:54.205333 systemd[1]: Switching root.
Feb 13 15:59:54.251550 systemd-journald[216]: Journal stopped
Feb 13 15:59:57.041515 systemd-journald[216]: Received SIGTERM from PID 1 (systemd).
Feb 13 15:59:57.041545 kernel: SELinux: policy capability network_peer_controls=1
Feb 13 15:59:57.041554 kernel: SELinux: policy capability open_perms=1
Feb 13 15:59:57.041560 kernel: SELinux: policy capability extended_socket_class=1
Feb 13 15:59:57.041566 kernel: SELinux: policy capability always_check_network=0
Feb 13 15:59:57.041571 kernel: SELinux: policy capability cgroup_seclabel=1
Feb 13 15:59:57.041579 kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 13 15:59:57.041585 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Feb 13 15:59:57.041591 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Feb 13 15:59:57.041597 systemd[1]: Successfully loaded SELinux policy in 82.631ms.
Feb 13 15:59:57.041605 kernel: audit: type=1403 audit(1739462395.787:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 13 15:59:57.041611 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 9.643ms.
Feb 13 15:59:57.041619 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Feb 13 15:59:57.041627 systemd[1]: Detected virtualization vmware.
Feb 13 15:59:57.041634 systemd[1]: Detected architecture x86-64.
Feb 13 15:59:57.041641 systemd[1]: Detected first boot.
Feb 13 15:59:57.041648 systemd[1]: Initializing machine ID from random generator.
Feb 13 15:59:57.041657 zram_generator::config[1081]: No configuration found.
Feb 13 15:59:57.042143 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Feb 13 15:59:57.042158 kernel: Guest personality initialized and is active
Feb 13 15:59:57.042165 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Feb 13 15:59:57.042171 kernel: Initialized host personality
Feb 13 15:59:57.042177 kernel: NET: Registered PF_VSOCK protocol family
Feb 13 15:59:57.042185 systemd[1]: Populated /etc with preset unit settings.
Feb 13 15:59:57.042196 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 15:59:57.042204 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Feb 13 15:59:57.042211 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Feb 13 15:59:57.042218 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 13 15:59:57.042224 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Feb 13 15:59:57.042231 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 13 15:59:57.042240 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Feb 13 15:59:57.042257 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Feb 13 15:59:57.042268 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Feb 13 15:59:57.042276 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Feb 13 15:59:57.042283 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Feb 13 15:59:57.042291 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Feb 13 15:59:57.042298 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Feb 13 15:59:57.042305 systemd[1]: Created slice user.slice - User and Session Slice.
Feb 13 15:59:57.042314 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 15:59:57.042322 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 15:59:57.042332 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Feb 13 15:59:57.042339 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Feb 13 15:59:57.042346 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Feb 13 15:59:57.042354 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 15:59:57.042361 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Feb 13 15:59:57.042368 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 15:59:57.042377 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Feb 13 15:59:57.042384 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Feb 13 15:59:57.042391 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Feb 13 15:59:57.042399 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Feb 13 15:59:57.042406 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 15:59:57.042413 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 15:59:57.042421 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 15:59:57.042428 systemd[1]: Reached target swap.target - Swaps.
Feb 13 15:59:57.042437 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Feb 13 15:59:57.042444 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Feb 13 15:59:57.042452 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Feb 13 15:59:57.042459 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 15:59:57.042466 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 15:59:57.042475 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 15:59:57.042482 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Feb 13 15:59:57.042490 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Feb 13 15:59:57.042497 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Feb 13 15:59:57.042504 systemd[1]: Mounting media.mount - External Media Directory...
Feb 13 15:59:57.042512 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:59:57.042519 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Feb 13 15:59:57.042527 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Feb 13 15:59:57.042536 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Feb 13 15:59:57.042543 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 13 15:59:57.042551 systemd[1]: Reached target machines.target - Containers.
Feb 13 15:59:57.042558 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Feb 13 15:59:57.042565 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Feb 13 15:59:57.042573 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 15:59:57.042581 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Feb 13 15:59:57.042588 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:59:57.042597 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 15:59:57.042605 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 15:59:57.042612 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Feb 13 15:59:57.042619 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 15:59:57.042627 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Feb 13 15:59:57.042634 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 13 15:59:57.042642 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Feb 13 15:59:57.042649 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Feb 13 15:59:57.042656 systemd[1]: Stopped systemd-fsck-usr.service.
Feb 13 15:59:57.042665 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Feb 13 15:59:57.042673 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 15:59:57.042680 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 15:59:57.042687 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Feb 13 15:59:57.042695 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Feb 13 15:59:57.042718 systemd-journald[1167]: Collecting audit messages is disabled.
Feb 13 15:59:57.042738 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Feb 13 15:59:57.042746 kernel: fuse: init (API version 7.39)
Feb 13 15:59:57.042752 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 15:59:57.042760 systemd-journald[1167]: Journal started
Feb 13 15:59:57.042779 systemd-journald[1167]: Runtime Journal (/run/log/journal/0ce119a422ca42609aa7bdf818996e4e) is 4.8M, max 38.6M, 33.8M free.
Feb 13 15:59:57.044375 systemd[1]: verity-setup.service: Deactivated successfully.
Feb 13 15:59:57.044399 systemd[1]: Stopped verity-setup.service.
Feb 13 15:59:57.044413 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:59:56.794718 systemd[1]: Queued start job for default target multi-user.target.
Feb 13 15:59:56.801030 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Feb 13 15:59:56.801383 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 13 15:59:57.046765 jq[1151]: true
Feb 13 15:59:57.054271 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 15:59:57.054738 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Feb 13 15:59:57.054922 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Feb 13 15:59:57.055087 systemd[1]: Mounted media.mount - External Media Directory.
Feb 13 15:59:57.055237 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Feb 13 15:59:57.055411 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Feb 13 15:59:57.055567 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Feb 13 15:59:57.057385 kernel: loop: module loaded
Feb 13 15:59:57.059625 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 15:59:57.059908 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 13 15:59:57.060007 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Feb 13 15:59:57.060291 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 15:59:57.060408 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 15:59:57.060698 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 15:59:57.060799 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 15:59:57.061053 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 15:59:57.061149 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Feb 13 15:59:57.061444 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 15:59:57.061536 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 15:59:57.065530 jq[1183]: true Feb 13 15:59:57.072860 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Feb 13 15:59:57.076887 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Feb 13 15:59:57.078961 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 15:59:57.081699 systemd[1]: Reached target network-pre.target - Preparation for Network. Feb 13 15:59:57.093000 kernel: ACPI: bus type drm_connector registered Feb 13 15:59:57.090615 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Feb 13 15:59:57.092314 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Feb 13 15:59:57.092469 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 15:59:57.092496 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 15:59:57.093350 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Feb 13 15:59:57.096003 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... 
Feb 13 15:59:57.098095 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Feb 13 15:59:57.099291 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 15:59:57.112392 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Feb 13 15:59:57.113590 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Feb 13 15:59:57.113736 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 15:59:57.115499 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Feb 13 15:59:57.115643 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 15:59:57.118776 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 15:59:57.121873 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Feb 13 15:59:57.124342 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 15:59:57.127371 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 15:59:57.127502 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 15:59:57.127774 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Feb 13 15:59:57.127974 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Feb 13 15:59:57.128115 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Feb 13 15:59:57.128595 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. 
Feb 13 15:59:57.159935 systemd-journald[1167]: Time spent on flushing to /var/log/journal/0ce119a422ca42609aa7bdf818996e4e is 22.987ms for 1852 entries. Feb 13 15:59:57.159935 systemd-journald[1167]: System Journal (/var/log/journal/0ce119a422ca42609aa7bdf818996e4e) is 8M, max 584.8M, 576.8M free. Feb 13 15:59:57.246537 systemd-journald[1167]: Received client request to flush runtime journal. Feb 13 15:59:57.246566 kernel: loop0: detected capacity change from 0 to 138176 Feb 13 15:59:57.218084 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 15:59:57.223408 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Feb 13 15:59:57.234067 udevadm[1227]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Feb 13 15:59:57.249574 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Feb 13 15:59:57.266926 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Feb 13 15:59:57.267564 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Feb 13 15:59:57.276453 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Feb 13 15:59:57.296680 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Feb 13 15:59:57.300980 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 15:59:57.328784 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Feb 13 15:59:57.328799 systemd-tmpfiles[1211]: ACLs are not supported, ignoring. Feb 13 15:59:57.331937 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 15:59:57.346422 systemd[1]: Starting systemd-sysusers.service - Create System Users... 
Feb 13 15:59:57.357614 ignition[1190]: Ignition 2.20.0 Feb 13 15:59:57.358205 ignition[1190]: deleting config from guestinfo properties Feb 13 15:59:57.375854 ignition[1190]: Successfully deleted config Feb 13 15:59:57.380146 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Feb 13 15:59:57.405037 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Feb 13 15:59:57.500293 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Feb 13 15:59:57.532314 kernel: loop1: detected capacity change from 0 to 2960 Feb 13 15:59:57.539631 systemd[1]: Finished systemd-sysusers.service - Create System Users. Feb 13 15:59:57.555139 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 15:59:57.562629 systemd-tmpfiles[1256]: ACLs are not supported, ignoring. Feb 13 15:59:57.562645 systemd-tmpfiles[1256]: ACLs are not supported, ignoring. Feb 13 15:59:57.565361 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 15:59:57.618272 kernel: loop2: detected capacity change from 0 to 205544 Feb 13 15:59:57.792267 kernel: loop3: detected capacity change from 0 to 147912 Feb 13 15:59:57.803211 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 15:59:57.908267 kernel: loop4: detected capacity change from 0 to 138176 Feb 13 15:59:58.008501 kernel: loop5: detected capacity change from 0 to 2960 Feb 13 15:59:58.034371 kernel: loop6: detected capacity change from 0 to 205544 Feb 13 15:59:58.182871 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Feb 13 15:59:58.189419 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 15:59:58.205923 systemd-udevd[1264]: Using default interface naming scheme 'v255'. 
Feb 13 15:59:58.233278 kernel: loop7: detected capacity change from 0 to 147912 Feb 13 15:59:58.248612 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 15:59:58.260773 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 15:59:58.266987 (sd-merge)[1262]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Feb 13 15:59:58.267312 (sd-merge)[1262]: Merged extensions into '/usr'. Feb 13 15:59:58.273443 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Feb 13 15:59:58.289611 systemd[1]: Reload requested from client PID 1210 ('systemd-sysext') (unit systemd-sysext.service)... Feb 13 15:59:58.289621 systemd[1]: Reloading... Feb 13 15:59:58.387275 zram_generator::config[1316]: No configuration found. Feb 13 15:59:58.488457 systemd-networkd[1271]: lo: Link UP Feb 13 15:59:58.488463 systemd-networkd[1271]: lo: Gained carrier Feb 13 15:59:58.488959 systemd-networkd[1271]: Enumeration completed Feb 13 15:59:58.493265 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Feb 13 15:59:58.493554 systemd-networkd[1271]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Feb 13 15:59:58.496540 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 15:59:58.496677 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 15:59:58.495808 systemd-networkd[1271]: ens192: Link UP Feb 13 15:59:58.495893 systemd-networkd[1271]: ens192: Gained carrier Feb 13 15:59:58.502329 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1279) Feb 13 15:59:58.507365 kernel: ACPI: button: Power Button [PWRF] Feb 13 15:59:58.528086 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." 
| grep -Po "inet \K[\d.]+") Feb 13 15:59:58.550174 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 15:59:58.626124 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Feb 13 15:59:58.626841 systemd[1]: Reloading finished in 336 ms. Feb 13 15:59:58.638891 systemd[1]: Started systemd-userdbd.service - User Database Manager. Feb 13 15:59:58.640194 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 15:59:58.640275 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Feb 13 15:59:58.640541 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Feb 13 15:59:58.655065 ldconfig[1205]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 15:59:58.659276 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Feb 13 15:59:58.659422 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Feb 13 15:59:58.662070 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 15:59:58.665446 systemd[1]: Starting ensure-sysext.service... Feb 13 15:59:58.667003 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Feb 13 15:59:58.669446 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Feb 13 15:59:58.671376 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Feb 13 15:59:58.676033 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 15:59:58.701848 systemd[1]: Reload requested from client PID 1383 ('systemctl') (unit ensure-sysext.service)... 
Feb 13 15:59:58.701858 systemd[1]: Reloading... Feb 13 15:59:58.702267 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 15:59:58.704726 (udev-worker)[1280]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Feb 13 15:59:58.722616 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 15:59:58.722784 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Feb 13 15:59:58.723315 systemd-tmpfiles[1387]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 15:59:58.723497 systemd-tmpfiles[1387]: ACLs are not supported, ignoring. Feb 13 15:59:58.723533 systemd-tmpfiles[1387]: ACLs are not supported, ignoring. Feb 13 15:59:58.748272 zram_generator::config[1423]: No configuration found. Feb 13 15:59:58.754399 systemd-tmpfiles[1387]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 15:59:58.754407 systemd-tmpfiles[1387]: Skipping /boot Feb 13 15:59:58.761165 systemd-tmpfiles[1387]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 15:59:58.761173 systemd-tmpfiles[1387]: Skipping /boot Feb 13 15:59:58.819852 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 15:59:58.840822 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 15:59:58.899510 systemd[1]: Reloading finished in 197 ms. Feb 13 15:59:58.924016 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. 
Feb 13 15:59:58.924444 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Feb 13 15:59:58.924832 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 15:59:58.933538 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 15:59:58.940183 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Feb 13 15:59:58.943440 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Feb 13 15:59:58.946897 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 15:59:58.948599 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Feb 13 15:59:58.950873 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:59:58.956123 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:59:58.958135 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 15:59:58.960153 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 15:59:58.962316 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 15:59:58.962470 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 15:59:58.962551 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 15:59:58.962629 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Feb 13 15:59:58.965169 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Feb 13 15:59:58.973484 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Feb 13 15:59:58.983745 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:59:58.984410 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 15:59:58.984503 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 15:59:58.984598 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:59:58.989797 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:59:58.992633 lvm[1499]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 15:59:58.999451 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 15:59:58.999653 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 15:59:58.999752 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 15:59:58.999878 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). 
Feb 13 15:59:59.002114 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Feb 13 15:59:59.004783 systemd[1]: Finished ensure-sysext.service. Feb 13 15:59:59.008496 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Feb 13 15:59:59.010711 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 15:59:59.010840 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 15:59:59.011118 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 15:59:59.011362 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 15:59:59.012112 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 15:59:59.014057 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 15:59:59.014176 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 15:59:59.014495 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 15:59:59.014618 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 15:59:59.017702 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 15:59:59.019613 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Feb 13 15:59:59.028428 systemd[1]: Starting systemd-update-done.service - Update is Completed... Feb 13 15:59:59.041915 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Feb 13 15:59:59.042111 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 15:59:59.046436 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Feb 13 15:59:59.046732 systemd[1]: Finished systemd-update-done.service - Update is Completed. 
Feb 13 15:59:59.051221 lvm[1526]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 15:59:59.051579 augenrules[1529]: No rules Feb 13 15:59:59.052526 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 15:59:59.053383 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 15:59:59.063654 systemd-resolved[1487]: Positive Trust Anchors: Feb 13 15:59:59.063662 systemd-resolved[1487]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 15:59:59.063685 systemd-resolved[1487]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 15:59:59.074440 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Feb 13 15:59:59.074593 systemd[1]: Reached target time-set.target - System Time Set. Feb 13 15:59:59.077975 systemd-resolved[1487]: Defaulting to hostname 'linux'. Feb 13 15:59:59.079830 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 15:59:59.079982 systemd[1]: Reached target network.target - Network. Feb 13 15:59:59.080061 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 15:59:59.083391 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Feb 13 15:59:59.133873 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. 
Feb 13 15:59:59.134474 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 15:59:59.142247 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:59:59.142541 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 15:59:59.142727 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Feb 13 15:59:59.142878 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Feb 13 15:59:59.143126 systemd[1]: Started logrotate.timer - Daily rotation of log files. Feb 13 15:59:59.143323 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Feb 13 15:59:59.143457 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Feb 13 15:59:59.143582 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 15:59:59.143600 systemd[1]: Reached target paths.target - Path Units. Feb 13 15:59:59.143707 systemd[1]: Reached target timers.target - Timer Units. Feb 13 15:59:59.144671 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Feb 13 15:59:59.146163 systemd[1]: Starting docker.socket - Docker Socket for the API... Feb 13 15:59:59.148191 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Feb 13 15:59:59.148513 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Feb 13 15:59:59.148667 systemd[1]: Reached target ssh-access.target - SSH Access Available. Feb 13 15:59:59.154668 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. 
Feb 13 15:59:59.155095 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Feb 13 15:59:59.155714 systemd[1]: Listening on docker.socket - Docker Socket for the API. Feb 13 15:59:59.155884 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 15:59:59.156008 systemd[1]: Reached target basic.target - Basic System. Feb 13 15:59:59.156149 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Feb 13 15:59:59.156174 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Feb 13 15:59:59.157056 systemd[1]: Starting containerd.service - containerd container runtime... Feb 13 15:59:59.160366 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Feb 13 15:59:59.162438 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Feb 13 15:59:59.163799 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Feb 13 15:59:59.164207 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Feb 13 15:59:59.167377 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Feb 13 15:59:59.170267 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Feb 13 15:59:59.171802 jq[1545]: false Feb 13 15:59:59.172360 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Feb 13 15:59:59.176365 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Feb 13 15:59:59.180182 systemd[1]: Starting systemd-logind.service - User Login Management... Feb 13 15:59:59.181561 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
Feb 13 15:59:59.182133 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 15:59:59.183393 systemd[1]: Starting update-engine.service - Update Engine... Feb 13 15:59:59.185149 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Feb 13 15:59:59.188518 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Feb 13 15:59:59.190518 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 15:59:59.190698 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Feb 13 15:59:59.193758 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 15:59:59.194375 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Feb 13 15:59:59.201007 jq[1555]: true Feb 13 15:59:59.214324 jq[1566]: true Feb 13 15:59:59.218589 extend-filesystems[1546]: Found loop4 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found loop5 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found loop6 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found loop7 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda1 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda2 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda3 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found usr Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda4 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda6 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda7 Feb 13 15:59:59.220779 extend-filesystems[1546]: Found sda9 Feb 13 15:59:59.220779 extend-filesystems[1546]: Checking size of /dev/sda9 Feb 13 15:59:59.221819 (ntainerd)[1568]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, 
TORCX_UNPACKDIR Feb 13 15:59:59.224190 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 15:59:59.224411 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Feb 13 15:59:59.226852 update_engine[1554]: I20250213 15:59:59.226629 1554 main.cc:92] Flatcar Update Engine starting Feb 13 15:59:59.238350 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Feb 13 15:59:59.242309 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Feb 13 15:59:59.244213 tar[1559]: linux-amd64/helm Feb 13 15:59:59.257443 extend-filesystems[1546]: Old size kept for /dev/sda9 Feb 13 15:59:59.257443 extend-filesystems[1546]: Found sr0 Feb 13 15:59:59.258124 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 15:59:59.258304 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Feb 13 16:01:13.845970 systemd-resolved[1487]: Clock change detected. Flushing caches. Feb 13 16:01:13.846362 systemd-timesyncd[1513]: Contacted time server 45.33.53.84:123 (0.flatcar.pool.ntp.org). Feb 13 16:01:13.846391 systemd-timesyncd[1513]: Initial clock synchronization to Thu 2025-02-13 16:01:13.845470 UTC. Feb 13 16:01:13.871514 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Feb 13 16:01:13.877457 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1267) Feb 13 16:01:13.885103 bash[1597]: Updated "/home/core/.ssh/authorized_keys" Feb 13 16:01:13.886282 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Feb 13 16:01:13.887148 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
Feb 13 16:01:13.890430 unknown[1593]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Feb 13 16:01:13.892427 systemd-logind[1551]: Watching system buttons on /dev/input/event1 (Power Button) Feb 13 16:01:13.892440 systemd-logind[1551]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Feb 13 16:01:13.893097 unknown[1593]: Core dump limit set to -1 Feb 13 16:01:13.895988 systemd-logind[1551]: New seat seat0. Feb 13 16:01:13.896317 systemd[1]: Started systemd-logind.service - User Login Management. Feb 13 16:01:13.904031 dbus-daemon[1544]: [system] SELinux support is enabled Feb 13 16:01:13.904534 systemd[1]: Started dbus.service - D-Bus System Message Bus. Feb 13 16:01:13.908358 update_engine[1554]: I20250213 16:01:13.908324 1554 update_check_scheduler.cc:74] Next update check in 11m14s Feb 13 16:01:13.908888 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 16:01:13.909047 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Feb 13 16:01:13.909723 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 16:01:13.909737 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Feb 13 16:01:13.909927 systemd[1]: Started update-engine.service - Update Engine. Feb 13 16:01:13.916052 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
Feb 13 16:01:14.062184 locksmithd[1607]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Feb 13 16:01:14.202149 sshd_keygen[1575]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Feb 13 16:01:14.216921 containerd[1568]: time="2025-02-13T16:01:14.216859197Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Feb 13 16:01:14.235067 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Feb 13 16:01:14.241082 systemd[1]: Starting issuegen.service - Generate /run/issue...
Feb 13 16:01:14.242504 containerd[1568]: time="2025-02-13T16:01:14.242482292Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244396846Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244412562Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244421740Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244506952Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244517522Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244561580Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244570220Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244675564Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244683738Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244691097Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245043 containerd[1568]: time="2025-02-13T16:01:14.244696511Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.244736203Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.244842881Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.244930237Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.244939485Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.244980764Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Feb 13 16:01:14.245214 containerd[1568]: time="2025-02-13T16:01:14.245005368Z" level=info msg="metadata content store policy set" policy=shared
Feb 13 16:01:14.248552 systemd[1]: issuegen.service: Deactivated successfully.
Feb 13 16:01:14.248766 systemd[1]: Finished issuegen.service - Generate /run/issue.
Feb 13 16:01:14.254051 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Feb 13 16:01:14.262348 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Feb 13 16:01:14.264944 containerd[1568]: time="2025-02-13T16:01:14.264920969Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Feb 13 16:01:14.265136 containerd[1568]: time="2025-02-13T16:01:14.264958799Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Feb 13 16:01:14.265136 containerd[1568]: time="2025-02-13T16:01:14.264972517Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Feb 13 16:01:14.265136 containerd[1568]: time="2025-02-13T16:01:14.264983856Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Feb 13 16:01:14.265136 containerd[1568]: time="2025-02-13T16:01:14.264992913Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Feb 13 16:01:14.265136 containerd[1568]: time="2025-02-13T16:01:14.265081591Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265227086Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265298255Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265309659Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265318478Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265326538Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265333623Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265340588Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265347987Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265355852Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265363655Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265370303Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265376661Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265388386Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266520 containerd[1568]: time="2025-02-13T16:01:14.265396404Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265403590Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265410610Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265417542Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265424896Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265431606Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265440850Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265448342Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265456535Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265463647Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265470484Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265477343Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265485232Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265496450Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265503755Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.266710 containerd[1568]: time="2025-02-13T16:01:14.265509246Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265536188Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265554514Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265564957Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265572142Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265577122Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265585421Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265590963Z" level=info msg="NRI interface is disabled by configuration."
Feb 13 16:01:14.267152 containerd[1568]: time="2025-02-13T16:01:14.265596750Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.265763199Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.265790380Z" level=info msg="Connect containerd service"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.265810138Z" level=info msg="using legacy CRI server"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.265814231Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.265876861Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266193155Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266281000Z" level=info msg="Start subscribing containerd event"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266306637Z" level=info msg="Start recovering state"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266326589Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266348296Z" level=info msg="Start event monitor"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266354782Z" level=info msg=serving... address=/run/containerd/containerd.sock
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266361845Z" level=info msg="Start snapshots syncer"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266369946Z" level=info msg="Start cni network conf syncer for default"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266374382Z" level=info msg="Start streaming server"
Feb 13 16:01:14.267269 containerd[1568]: time="2025-02-13T16:01:14.266401902Z" level=info msg="containerd successfully booted in 0.050590s"
Feb 13 16:01:14.269116 systemd[1]: Started getty@tty1.service - Getty on tty1.
Feb 13 16:01:14.271089 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Feb 13 16:01:14.271280 systemd[1]: Reached target getty.target - Login Prompts.
Feb 13 16:01:14.271525 systemd[1]: Started containerd.service - containerd container runtime.
Feb 13 16:01:14.323243 tar[1559]: linux-amd64/LICENSE
Feb 13 16:01:14.323954 tar[1559]: linux-amd64/README.md
Feb 13 16:01:14.332574 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Feb 13 16:01:14.541039 systemd-networkd[1271]: ens192: Gained IPv6LL
Feb 13 16:01:14.542364 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Feb 13 16:01:14.543219 systemd[1]: Reached target network-online.target - Network is Online.
Feb 13 16:01:14.549170 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Feb 13 16:01:14.558320 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:01:14.561965 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Feb 13 16:01:14.591636 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Feb 13 16:01:14.601358 systemd[1]: coreos-metadata.service: Deactivated successfully.
Feb 13 16:01:14.601520 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
Feb 13 16:01:14.601960 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Feb 13 16:01:18.029533 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:01:18.030065 systemd[1]: Reached target multi-user.target - Multi-User System.
Feb 13 16:01:18.030960 systemd[1]: Startup finished in 1.008s (kernel) + 8.110s (initrd) + 7.744s (userspace) = 16.863s.
Feb 13 16:01:18.043381 (kubelet)[1720]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:01:18.084408 login[1685]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Feb 13 16:01:18.086432 login[1686]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0)
Feb 13 16:01:18.092570 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Feb 13 16:01:18.102143 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Feb 13 16:01:18.105704 systemd-logind[1551]: New session 1 of user core.
Feb 13 16:01:18.109079 systemd-logind[1551]: New session 2 of user core.
Feb 13 16:01:18.113122 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Feb 13 16:01:18.119118 systemd[1]: Starting user@500.service - User Manager for UID 500...
Feb 13 16:01:18.121369 (systemd)[1727]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Feb 13 16:01:18.123170 systemd-logind[1551]: New session c1 of user core.
Feb 13 16:01:18.212919 systemd[1727]: Queued start job for default target default.target.
Feb 13 16:01:18.226958 systemd[1727]: Created slice app.slice - User Application Slice.
Feb 13 16:01:18.226980 systemd[1727]: Reached target paths.target - Paths.
Feb 13 16:01:18.227011 systemd[1727]: Reached target timers.target - Timers.
Feb 13 16:01:18.227775 systemd[1727]: Starting dbus.socket - D-Bus User Message Bus Socket...
Feb 13 16:01:18.237642 systemd[1727]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Feb 13 16:01:18.237679 systemd[1727]: Reached target sockets.target - Sockets.
Feb 13 16:01:18.237709 systemd[1727]: Reached target basic.target - Basic System.
Feb 13 16:01:18.237730 systemd[1727]: Reached target default.target - Main User Target.
Feb 13 16:01:18.237746 systemd[1727]: Startup finished in 110ms.
Feb 13 16:01:18.237833 systemd[1]: Started user@500.service - User Manager for UID 500.
Feb 13 16:01:18.248040 systemd[1]: Started session-1.scope - Session 1 of User core.
Feb 13 16:01:18.248784 systemd[1]: Started session-2.scope - Session 2 of User core.
Feb 13 16:01:18.660833 kubelet[1720]: E0213 16:01:18.660755 1720 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:01:18.662167 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:01:18.662263 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:01:18.662474 systemd[1]: kubelet.service: Consumed 653ms CPU time, 236.9M memory peak.
Feb 13 16:01:28.855521 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Feb 13 16:01:28.865082 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:01:29.191521 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:01:29.194356 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:01:29.279988 kubelet[1771]: E0213 16:01:29.279961 1771 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:01:29.282464 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:01:29.282558 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:01:29.282753 systemd[1]: kubelet.service: Consumed 84ms CPU time, 98.8M memory peak.
Feb 13 16:01:39.355665 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Feb 13 16:01:39.363125 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:01:39.674236 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:01:39.677296 (kubelet)[1786]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:01:39.701537 kubelet[1786]: E0213 16:01:39.701505 1786 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:01:39.702603 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:01:39.702683 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:01:39.702856 systemd[1]: kubelet.service: Consumed 86ms CPU time, 97.3M memory peak.
Feb 13 16:01:49.855720 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Feb 13 16:01:49.865087 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:01:50.149426 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:01:50.152735 (kubelet)[1801]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:01:50.175635 kubelet[1801]: E0213 16:01:50.175605 1801 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:01:50.177080 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:01:50.177166 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:01:50.177345 systemd[1]: kubelet.service: Consumed 85ms CPU time, 95.1M memory peak.
Feb 13 16:01:54.096663 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Feb 13 16:01:54.097886 systemd[1]: Started sshd@0-139.178.70.107:22-147.75.109.163:53004.service - OpenSSH per-connection server daemon (147.75.109.163:53004).
Feb 13 16:01:54.139834 sshd[1808]: Accepted publickey for core from 147.75.109.163 port 53004 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.140807 sshd-session[1808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.143546 systemd-logind[1551]: New session 3 of user core.
Feb 13 16:01:54.154001 systemd[1]: Started session-3.scope - Session 3 of User core.
Feb 13 16:01:54.208081 systemd[1]: Started sshd@1-139.178.70.107:22-147.75.109.163:53018.service - OpenSSH per-connection server daemon (147.75.109.163:53018).
Feb 13 16:01:54.244593 sshd[1813]: Accepted publickey for core from 147.75.109.163 port 53018 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.245437 sshd-session[1813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.248931 systemd-logind[1551]: New session 4 of user core.
Feb 13 16:01:54.263036 systemd[1]: Started session-4.scope - Session 4 of User core.
Feb 13 16:01:54.311160 sshd[1815]: Connection closed by 147.75.109.163 port 53018
Feb 13 16:01:54.311921 sshd-session[1813]: pam_unix(sshd:session): session closed for user core
Feb 13 16:01:54.321742 systemd[1]: sshd@1-139.178.70.107:22-147.75.109.163:53018.service: Deactivated successfully.
Feb 13 16:01:54.322668 systemd[1]: session-4.scope: Deactivated successfully.
Feb 13 16:01:54.323462 systemd-logind[1551]: Session 4 logged out. Waiting for processes to exit.
Feb 13 16:01:54.327246 systemd[1]: Started sshd@2-139.178.70.107:22-147.75.109.163:53030.service - OpenSSH per-connection server daemon (147.75.109.163:53030).
Feb 13 16:01:54.328263 systemd-logind[1551]: Removed session 4.
Feb 13 16:01:54.360297 sshd[1820]: Accepted publickey for core from 147.75.109.163 port 53030 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.361026 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.363811 systemd-logind[1551]: New session 5 of user core.
Feb 13 16:01:54.370050 systemd[1]: Started session-5.scope - Session 5 of User core.
Feb 13 16:01:54.415997 sshd[1823]: Connection closed by 147.75.109.163 port 53030
Feb 13 16:01:54.416365 sshd-session[1820]: pam_unix(sshd:session): session closed for user core
Feb 13 16:01:54.425535 systemd[1]: sshd@2-139.178.70.107:22-147.75.109.163:53030.service: Deactivated successfully.
Feb 13 16:01:54.427021 systemd[1]: session-5.scope: Deactivated successfully.
Feb 13 16:01:54.427538 systemd-logind[1551]: Session 5 logged out. Waiting for processes to exit.
Feb 13 16:01:54.428752 systemd[1]: Started sshd@3-139.178.70.107:22-147.75.109.163:53042.service - OpenSSH per-connection server daemon (147.75.109.163:53042).
Feb 13 16:01:54.430146 systemd-logind[1551]: Removed session 5.
Feb 13 16:01:54.467935 sshd[1828]: Accepted publickey for core from 147.75.109.163 port 53042 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.468843 sshd-session[1828]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.472263 systemd-logind[1551]: New session 6 of user core.
Feb 13 16:01:54.479026 systemd[1]: Started session-6.scope - Session 6 of User core.
Feb 13 16:01:54.528027 sshd[1831]: Connection closed by 147.75.109.163 port 53042
Feb 13 16:01:54.528418 sshd-session[1828]: pam_unix(sshd:session): session closed for user core
Feb 13 16:01:54.542133 systemd[1]: sshd@3-139.178.70.107:22-147.75.109.163:53042.service: Deactivated successfully.
Feb 13 16:01:54.543090 systemd[1]: session-6.scope: Deactivated successfully.
Feb 13 16:01:54.543932 systemd-logind[1551]: Session 6 logged out. Waiting for processes to exit.
Feb 13 16:01:54.546129 systemd[1]: Started sshd@4-139.178.70.107:22-147.75.109.163:53058.service - OpenSSH per-connection server daemon (147.75.109.163:53058).
Feb 13 16:01:54.547396 systemd-logind[1551]: Removed session 6.
Feb 13 16:01:54.579556 sshd[1836]: Accepted publickey for core from 147.75.109.163 port 53058 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.580207 sshd-session[1836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.583918 systemd-logind[1551]: New session 7 of user core.
Feb 13 16:01:54.588114 systemd[1]: Started session-7.scope - Session 7 of User core.
Feb 13 16:01:54.651695 sudo[1840]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Feb 13 16:01:54.651898 sudo[1840]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:01:54.663517 sudo[1840]: pam_unix(sudo:session): session closed for user root
Feb 13 16:01:54.665120 sshd[1839]: Connection closed by 147.75.109.163 port 53058
Feb 13 16:01:54.665104 sshd-session[1836]: pam_unix(sshd:session): session closed for user core
Feb 13 16:01:54.674194 systemd[1]: sshd@4-139.178.70.107:22-147.75.109.163:53058.service: Deactivated successfully.
Feb 13 16:01:54.675220 systemd[1]: session-7.scope: Deactivated successfully.
Feb 13 16:01:54.675852 systemd-logind[1551]: Session 7 logged out. Waiting for processes to exit.
Feb 13 16:01:54.681133 systemd[1]: Started sshd@5-139.178.70.107:22-147.75.109.163:53068.service - OpenSSH per-connection server daemon (147.75.109.163:53068).
Feb 13 16:01:54.682437 systemd-logind[1551]: Removed session 7.
Feb 13 16:01:54.718627 sshd[1845]: Accepted publickey for core from 147.75.109.163 port 53068 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.719523 sshd-session[1845]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.723228 systemd-logind[1551]: New session 8 of user core.
Feb 13 16:01:54.734007 systemd[1]: Started session-8.scope - Session 8 of User core.
Feb 13 16:01:54.784019 sudo[1850]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Feb 13 16:01:54.784224 sudo[1850]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:01:54.786553 sudo[1850]: pam_unix(sudo:session): session closed for user root
Feb 13 16:01:54.790010 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Feb 13 16:01:54.790186 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:01:54.805144 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 16:01:54.819548 augenrules[1872]: No rules
Feb 13 16:01:54.820183 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 16:01:54.820375 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 16:01:54.821024 sudo[1849]: pam_unix(sudo:session): session closed for user root
Feb 13 16:01:54.821872 sshd[1848]: Connection closed by 147.75.109.163 port 53068
Feb 13 16:01:54.821803 sshd-session[1845]: pam_unix(sshd:session): session closed for user core
Feb 13 16:01:54.826845 systemd[1]: sshd@5-139.178.70.107:22-147.75.109.163:53068.service: Deactivated successfully.
Feb 13 16:01:54.827599 systemd[1]: session-8.scope: Deactivated successfully.
Feb 13 16:01:54.828331 systemd-logind[1551]: Session 8 logged out. Waiting for processes to exit.
Feb 13 16:01:54.829027 systemd[1]: Started sshd@6-139.178.70.107:22-147.75.109.163:53078.service - OpenSSH per-connection server daemon (147.75.109.163:53078).
Feb 13 16:01:54.830163 systemd-logind[1551]: Removed session 8.
Feb 13 16:01:54.863571 sshd[1880]: Accepted publickey for core from 147.75.109.163 port 53078 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:01:54.864297 sshd-session[1880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:01:54.866827 systemd-logind[1551]: New session 9 of user core.
Feb 13 16:01:54.874998 systemd[1]: Started session-9.scope - Session 9 of User core.
Feb 13 16:01:54.922897 sudo[1884]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Feb 13 16:01:54.923147 sudo[1884]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Feb 13 16:01:55.216161 (dockerd)[1901]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Feb 13 16:01:55.216375 systemd[1]: Starting docker.service - Docker Application Container Engine...
Feb 13 16:01:55.483978 dockerd[1901]: time="2025-02-13T16:01:55.483894689Z" level=info msg="Starting up"
Feb 13 16:01:55.537564 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport91984712-merged.mount: Deactivated successfully.
Feb 13 16:01:55.555295 dockerd[1901]: time="2025-02-13T16:01:55.555170421Z" level=info msg="Loading containers: start."
Feb 13 16:01:55.646923 kernel: Initializing XFRM netlink socket
Feb 13 16:01:55.700409 systemd-networkd[1271]: docker0: Link UP
Feb 13 16:01:55.719604 dockerd[1901]: time="2025-02-13T16:01:55.719585501Z" level=info msg="Loading containers: done."
Feb 13 16:01:55.727157 dockerd[1901]: time="2025-02-13T16:01:55.727137352Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 16:01:55.727462 dockerd[1901]: time="2025-02-13T16:01:55.727276952Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Feb 13 16:01:55.727462 dockerd[1901]: time="2025-02-13T16:01:55.727335199Z" level=info msg="Daemon has completed initialization" Feb 13 16:01:55.743010 dockerd[1901]: time="2025-02-13T16:01:55.742656364Z" level=info msg="API listen on /run/docker.sock" Feb 13 16:01:55.743010 systemd[1]: Started docker.service - Docker Application Container Engine. Feb 13 16:01:56.645140 containerd[1568]: time="2025-02-13T16:01:56.645098594Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.6\"" Feb 13 16:01:57.248268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3771495981.mount: Deactivated successfully. 
Feb 13 16:01:58.231572 containerd[1568]: time="2025-02-13T16:01:58.231541488Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:01:58.232205 containerd[1568]: time="2025-02-13T16:01:58.232190341Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.6: active requests=0, bytes read=27976588" Feb 13 16:01:58.232787 containerd[1568]: time="2025-02-13T16:01:58.232622096Z" level=info msg="ImageCreate event name:\"sha256:1372127edc9da70a68712c470a11f621ed256e8be0dfec4c4d58ca09109352a3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:01:58.234162 containerd[1568]: time="2025-02-13T16:01:58.234141603Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:be0a2d815793b0408d921a50b82759e654cf1bba718cac480498391926902905\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:01:58.234944 containerd[1568]: time="2025-02-13T16:01:58.234752703Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.6\" with image id \"sha256:1372127edc9da70a68712c470a11f621ed256e8be0dfec4c4d58ca09109352a3\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:be0a2d815793b0408d921a50b82759e654cf1bba718cac480498391926902905\", size \"27973388\" in 1.589629971s" Feb 13 16:01:58.234944 containerd[1568]: time="2025-02-13T16:01:58.234773943Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.6\" returns image reference \"sha256:1372127edc9da70a68712c470a11f621ed256e8be0dfec4c4d58ca09109352a3\"" Feb 13 16:01:58.236087 containerd[1568]: time="2025-02-13T16:01:58.236074856Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.6\"" Feb 13 16:01:59.172023 update_engine[1554]: I20250213 16:01:59.171983 1554 update_attempter.cc:509] Updating boot flags... 
Feb 13 16:01:59.208695 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2157)
Feb 13 16:01:59.278039 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2161)
Feb 13 16:01:59.669997 containerd[1568]: time="2025-02-13T16:01:59.669963132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:01:59.674936 containerd[1568]: time="2025-02-13T16:01:59.674824588Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.6: active requests=0, bytes read=24708193"
Feb 13 16:01:59.678546 containerd[1568]: time="2025-02-13T16:01:59.678516018Z" level=info msg="ImageCreate event name:\"sha256:5f23cb154eea1f587685082e456e95e5480c1d459849b1c634119d7de897e34e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:01:59.701492 containerd[1568]: time="2025-02-13T16:01:59.701433435Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:63166e537a82855ac9b54ffa8b510429fe799ed9b062bf6b788b74e1d5995d12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:01:59.702264 containerd[1568]: time="2025-02-13T16:01:59.702177828Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.6\" with image id \"sha256:5f23cb154eea1f587685082e456e95e5480c1d459849b1c634119d7de897e34e\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:63166e537a82855ac9b54ffa8b510429fe799ed9b062bf6b788b74e1d5995d12\", size \"26154739\" in 1.466087984s"
Feb 13 16:01:59.702264 containerd[1568]: time="2025-02-13T16:01:59.702197342Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.6\" returns image reference \"sha256:5f23cb154eea1f587685082e456e95e5480c1d459849b1c634119d7de897e34e\""
Feb 13 16:01:59.702709 containerd[1568]: time="2025-02-13T16:01:59.702618494Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.6\""
Feb 13 16:02:00.355518 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Feb 13 16:02:00.363010 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:02:00.427954 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:02:00.428449 (kubelet)[2177]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 16:02:00.478926 kubelet[2177]: E0213 16:02:00.478453 2177 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 16:02:00.480468 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 16:02:00.480872 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 16:02:00.481245 systemd[1]: kubelet.service: Consumed 82ms CPU time, 95.8M memory peak.
Feb 13 16:02:00.957218 containerd[1568]: time="2025-02-13T16:02:00.957167464Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:00.967592 containerd[1568]: time="2025-02-13T16:02:00.967551175Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.6: active requests=0, bytes read=18652425" Feb 13 16:02:00.982058 containerd[1568]: time="2025-02-13T16:02:00.982023303Z" level=info msg="ImageCreate event name:\"sha256:9195ad415d31e3c2df6dddf4603bc56915b71486f514455bc3b5389b9b0ed9c1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:00.992370 containerd[1568]: time="2025-02-13T16:02:00.992314926Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:8a64af33c57346355dc3cc6f9225dbe771da30e2f427e802ce2340ec3b5dd9b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:00.993191 containerd[1568]: time="2025-02-13T16:02:00.993050458Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.6\" with image id \"sha256:9195ad415d31e3c2df6dddf4603bc56915b71486f514455bc3b5389b9b0ed9c1\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:8a64af33c57346355dc3cc6f9225dbe771da30e2f427e802ce2340ec3b5dd9b5\", size \"20098989\" in 1.290403183s" Feb 13 16:02:00.993191 containerd[1568]: time="2025-02-13T16:02:00.993077925Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.6\" returns image reference \"sha256:9195ad415d31e3c2df6dddf4603bc56915b71486f514455bc3b5389b9b0ed9c1\"" Feb 13 16:02:00.993794 containerd[1568]: time="2025-02-13T16:02:00.993656987Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.6\"" Feb 13 16:02:02.033059 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2474160350.mount: Deactivated successfully. 
Feb 13 16:02:02.484567 containerd[1568]: time="2025-02-13T16:02:02.484058094Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:02.491296 containerd[1568]: time="2025-02-13T16:02:02.491271217Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.6: active requests=0, bytes read=30229108" Feb 13 16:02:02.500801 containerd[1568]: time="2025-02-13T16:02:02.500783616Z" level=info msg="ImageCreate event name:\"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:02.508503 containerd[1568]: time="2025-02-13T16:02:02.508485095Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e72a4bc769f10b56ffdfe2cdb21d84d49d9bc194b3658648207998a5bd924b72\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:02.508940 containerd[1568]: time="2025-02-13T16:02:02.508899672Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.6\" with image id \"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\", repo tag \"registry.k8s.io/kube-proxy:v1.31.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:e72a4bc769f10b56ffdfe2cdb21d84d49d9bc194b3658648207998a5bd924b72\", size \"30228127\" in 1.515220987s" Feb 13 16:02:02.508940 containerd[1568]: time="2025-02-13T16:02:02.508937620Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.6\" returns image reference \"sha256:d2448f015605e48efb6b06ceaba0cb6d48bfd82e5d30ba357a9bd78c8566348a\"" Feb 13 16:02:02.509399 containerd[1568]: time="2025-02-13T16:02:02.509378797Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Feb 13 16:02:03.981068 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2922122285.mount: Deactivated successfully. 
Feb 13 16:02:04.668724 containerd[1568]: time="2025-02-13T16:02:04.668453383Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:04.669068 containerd[1568]: time="2025-02-13T16:02:04.669019821Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Feb 13 16:02:04.669551 containerd[1568]: time="2025-02-13T16:02:04.669133173Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:04.671057 containerd[1568]: time="2025-02-13T16:02:04.671041711Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:04.671727 containerd[1568]: time="2025-02-13T16:02:04.671711093Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.162313195s" Feb 13 16:02:04.671756 containerd[1568]: time="2025-02-13T16:02:04.671728325Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Feb 13 16:02:04.672040 containerd[1568]: time="2025-02-13T16:02:04.672007417Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Feb 13 16:02:05.103450 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount315717963.mount: Deactivated successfully. 
Feb 13 16:02:05.106613 containerd[1568]: time="2025-02-13T16:02:05.106185776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:05.106978 containerd[1568]: time="2025-02-13T16:02:05.106889022Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Feb 13 16:02:05.107845 containerd[1568]: time="2025-02-13T16:02:05.107239059Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:05.112725 containerd[1568]: time="2025-02-13T16:02:05.112712720Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:05.113242 containerd[1568]: time="2025-02-13T16:02:05.113046862Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 441.025636ms" Feb 13 16:02:05.113455 containerd[1568]: time="2025-02-13T16:02:05.113445179Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Feb 13 16:02:05.113772 containerd[1568]: time="2025-02-13T16:02:05.113757030Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Feb 13 16:02:05.612161 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1521339011.mount: Deactivated successfully. 
Feb 13 16:02:07.067980 containerd[1568]: time="2025-02-13T16:02:07.067946633Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:07.068575 containerd[1568]: time="2025-02-13T16:02:07.068562012Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=56779973" Feb 13 16:02:07.071934 containerd[1568]: time="2025-02-13T16:02:07.071374751Z" level=info msg="ImageCreate event name:\"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:07.072834 containerd[1568]: time="2025-02-13T16:02:07.072808622Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:07.073522 containerd[1568]: time="2025-02-13T16:02:07.073504844Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"56909194\" in 1.959668356s" Feb 13 16:02:07.073553 containerd[1568]: time="2025-02-13T16:02:07.073524301Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:2e96e5913fc06e3d26915af3d0f2ca5048cc4b6327e661e80da792cbf8d8d9d4\"" Feb 13 16:02:08.804411 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:08.804505 systemd[1]: kubelet.service: Consumed 82ms CPU time, 95.8M memory peak. Feb 13 16:02:08.815040 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:02:08.829311 systemd[1]: Reload requested from client PID 2321 ('systemctl') (unit session-9.scope)... 
Feb 13 16:02:08.829328 systemd[1]: Reloading... Feb 13 16:02:08.898009 zram_generator::config[2365]: No configuration found. Feb 13 16:02:08.955287 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:02:08.973558 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 16:02:09.037326 systemd[1]: Reloading finished in 207 ms. Feb 13 16:02:09.053707 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Feb 13 16:02:09.053860 systemd[1]: kubelet.service: Failed with result 'signal'. Feb 13 16:02:09.054140 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:09.059061 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:02:09.348111 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:09.351708 (kubelet)[2433]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 16:02:09.375729 kubelet[2433]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 16:02:09.375729 kubelet[2433]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 13 16:02:09.375729 kubelet[2433]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 16:02:09.379410 kubelet[2433]: I0213 16:02:09.379372 2433 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 16:02:09.771915 kubelet[2433]: I0213 16:02:09.771879 2433 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Feb 13 16:02:09.771915 kubelet[2433]: I0213 16:02:09.771912 2433 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 16:02:09.772111 kubelet[2433]: I0213 16:02:09.772098 2433 server.go:929] "Client rotation is on, will bootstrap in background"
Feb 13 16:02:09.964872 kubelet[2433]: I0213 16:02:09.964833 2433 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 16:02:09.974096 kubelet[2433]: E0213 16:02:09.974072 2433 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:09.996483 kubelet[2433]: E0213 16:02:09.996399 2433 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Feb 13 16:02:09.996483 kubelet[2433]: I0213 16:02:09.996421 2433 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Feb 13 16:02:10.004966 kubelet[2433]: I0213 16:02:10.004813 2433 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 16:02:10.005753 kubelet[2433]: I0213 16:02:10.005734 2433 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 13 16:02:10.005856 kubelet[2433]: I0213 16:02:10.005832 2433 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 16:02:10.005990 kubelet[2433]: I0213 16:02:10.005857 2433 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 13 16:02:10.006075 kubelet[2433]: I0213 16:02:10.005996 2433 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 16:02:10.006075 kubelet[2433]: I0213 16:02:10.006003 2433 container_manager_linux.go:300] "Creating device plugin manager"
Feb 13 16:02:10.006116 kubelet[2433]: I0213 16:02:10.006079 2433 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 16:02:10.007790 kubelet[2433]: I0213 16:02:10.007630 2433 kubelet.go:408] "Attempting to sync node with API server"
Feb 13 16:02:10.007790 kubelet[2433]: I0213 16:02:10.007646 2433 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 16:02:10.007790 kubelet[2433]: I0213 16:02:10.007665 2433 kubelet.go:314] "Adding apiserver pod source"
Feb 13 16:02:10.007790 kubelet[2433]: I0213 16:02:10.007671 2433 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 16:02:10.010382 kubelet[2433]: W0213 16:02:10.010237 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.010382 kubelet[2433]: E0213 16:02:10.010277 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:10.011382 kubelet[2433]: W0213 16:02:10.011313 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.011382 kubelet[2433]: E0213 16:02:10.011339 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:10.014468 kubelet[2433]: I0213 16:02:10.014437 2433 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 16:02:10.017478 kubelet[2433]: I0213 16:02:10.017371 2433 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 16:02:10.019453 kubelet[2433]: W0213 16:02:10.019274 2433 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 13 16:02:10.020589 kubelet[2433]: I0213 16:02:10.019663 2433 server.go:1269] "Started kubelet"
Feb 13 16:02:10.020589 kubelet[2433]: I0213 16:02:10.019736 2433 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 16:02:10.020589 kubelet[2433]: I0213 16:02:10.020383 2433 server.go:460] "Adding debug handlers to kubelet server"
Feb 13 16:02:10.023090 kubelet[2433]: I0213 16:02:10.022795 2433 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 16:02:10.023090 kubelet[2433]: I0213 16:02:10.022966 2433 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 16:02:10.024094 kubelet[2433]: I0213 16:02:10.023591 2433 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 16:02:10.025868 kubelet[2433]: E0213 16:02:10.023974 2433 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.107:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.107:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1823cff479ec5b74 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 16:02:10.019646324 +0000 UTC m=+0.665962174,LastTimestamp:2025-02-13 16:02:10.019646324 +0000 UTC m=+0.665962174,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Feb 13 16:02:10.026148 kubelet[2433]: I0213 16:02:10.026135 2433 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Feb 13 16:02:10.027366 kubelet[2433]: I0213 16:02:10.027357 2433 volume_manager.go:289] "Starting Kubelet Volume Manager"
Feb 13 16:02:10.027542 kubelet[2433]: E0213 16:02:10.027532 2433 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 16:02:10.027983 kubelet[2433]: E0213 16:02:10.027961 2433 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="200ms"
Feb 13 16:02:10.027983 kubelet[2433]: I0213 16:02:10.027986 2433 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Feb 13 16:02:10.028439 kubelet[2433]: W0213 16:02:10.028410 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.028558 kubelet[2433]: E0213 16:02:10.028439 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:10.028558 kubelet[2433]: I0213 16:02:10.028471 2433 reconciler.go:26] "Reconciler: start to sync state"
Feb 13 16:02:10.031747 kubelet[2433]: I0213 16:02:10.031735 2433 factory.go:221] Registration of the containerd container factory successfully
Feb 13 16:02:10.032910 kubelet[2433]: I0213 16:02:10.031823 2433 factory.go:221] Registration of the systemd container factory successfully
Feb 13 16:02:10.032910 kubelet[2433]: I0213 16:02:10.031876 2433 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Feb 13 16:02:10.038334 kubelet[2433]: I0213 16:02:10.038302 2433 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 13 16:02:10.039177 kubelet[2433]: I0213 16:02:10.039163 2433 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 13 16:02:10.039252 kubelet[2433]: I0213 16:02:10.039247 2433 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 13 16:02:10.039292 kubelet[2433]: I0213 16:02:10.039288 2433 kubelet.go:2321] "Starting kubelet main sync loop"
Feb 13 16:02:10.039350 kubelet[2433]: E0213 16:02:10.039340 2433 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 13 16:02:10.043822 kubelet[2433]: W0213 16:02:10.043795 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.043953 kubelet[2433]: E0213 16:02:10.043942 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:10.044049 kubelet[2433]: E0213 16:02:10.044041 2433 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 13 16:02:10.055559 kubelet[2433]: I0213 16:02:10.055538 2433 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 16:02:10.055559 kubelet[2433]: I0213 16:02:10.055549 2433 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 16:02:10.055559 kubelet[2433]: I0213 16:02:10.055559 2433 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 16:02:10.056618 kubelet[2433]: I0213 16:02:10.056603 2433 policy_none.go:49] "None policy: Start"
Feb 13 16:02:10.056983 kubelet[2433]: I0213 16:02:10.056972 2433 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 13 16:02:10.057045 kubelet[2433]: I0213 16:02:10.057025 2433 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 16:02:10.060537 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Feb 13 16:02:10.070642 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Feb 13 16:02:10.072937 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Feb 13 16:02:10.082801 kubelet[2433]: I0213 16:02:10.082663 2433 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 16:02:10.082801 kubelet[2433]: I0213 16:02:10.082805 2433 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 16:02:10.083034 kubelet[2433]: I0213 16:02:10.082816 2433 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 16:02:10.083071 kubelet[2433]: I0213 16:02:10.083051 2433 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 16:02:10.084726 kubelet[2433]: E0213 16:02:10.084625 2433 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Feb 13 16:02:10.146803 systemd[1]: Created slice kubepods-burstable-pod490d524cde0ce697ff81843f7337cc40.slice - libcontainer container kubepods-burstable-pod490d524cde0ce697ff81843f7337cc40.slice. Feb 13 16:02:10.159282 systemd[1]: Created slice kubepods-burstable-pod04cca2c455deeb5da380812dcab224d8.slice - libcontainer container kubepods-burstable-pod04cca2c455deeb5da380812dcab224d8.slice. Feb 13 16:02:10.165351 systemd[1]: Created slice kubepods-burstable-pod98eb2295280bc6da80e83f7636be329c.slice - libcontainer container kubepods-burstable-pod98eb2295280bc6da80e83f7636be329c.slice. 
Feb 13 16:02:10.184474 kubelet[2433]: I0213 16:02:10.184454 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:10.184665 kubelet[2433]: E0213 16:02:10.184648 2433 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost"
Feb 13 16:02:10.229222 kubelet[2433]: I0213 16:02:10.229143 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 16:02:10.229222 kubelet[2433]: E0213 16:02:10.229146 2433 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="400ms"
Feb 13 16:02:10.229222 kubelet[2433]: I0213 16:02:10.229164 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 16:02:10.229222 kubelet[2433]: I0213 16:02:10.229196 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 16:02:10.229222 kubelet[2433]: I0213 16:02:10.229205 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 16:02:10.229480 kubelet[2433]: I0213 16:02:10.229214 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 16:02:10.229480 kubelet[2433]: I0213 16:02:10.229222 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 16:02:10.229480 kubelet[2433]: I0213 16:02:10.229231 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 16:02:10.229480 kubelet[2433]: I0213 16:02:10.229242 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 16:02:10.229480 kubelet[2433]: I0213 16:02:10.229250 2433 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/04cca2c455deeb5da380812dcab224d8-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"04cca2c455deeb5da380812dcab224d8\") " pod="kube-system/kube-scheduler-localhost"
Feb 13 16:02:10.386615 kubelet[2433]: I0213 16:02:10.386577 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:10.387075 kubelet[2433]: E0213 16:02:10.387040 2433 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost"
Feb 13 16:02:10.459838 containerd[1568]: time="2025-02-13T16:02:10.459812020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:490d524cde0ce697ff81843f7337cc40,Namespace:kube-system,Attempt:0,}"
Feb 13 16:02:10.476374 containerd[1568]: time="2025-02-13T16:02:10.476172410Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:04cca2c455deeb5da380812dcab224d8,Namespace:kube-system,Attempt:0,}"
Feb 13 16:02:10.476489 containerd[1568]: time="2025-02-13T16:02:10.476477844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:98eb2295280bc6da80e83f7636be329c,Namespace:kube-system,Attempt:0,}"
Feb 13 16:02:10.629619 kubelet[2433]: E0213 16:02:10.629591 2433 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="800ms"
Feb 13 16:02:10.788399 kubelet[2433]: I0213 16:02:10.788133 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:10.788462 kubelet[2433]: E0213 16:02:10.788356 2433 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost"
Feb 13 16:02:10.821832 kubelet[2433]: W0213 16:02:10.821770 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.821832 kubelet[2433]: E0213 16:02:10.821815 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:10.964077 kubelet[2433]: W0213 16:02:10.964014 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:10.964077 kubelet[2433]: E0213 16:02:10.964058 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:11.001594 kubelet[2433]: W0213 16:02:11.001534 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:11.001594 kubelet[2433]: E0213 16:02:11.001574 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:11.264951 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3844365711.mount: Deactivated successfully.
Feb 13 16:02:11.302924 containerd[1568]: time="2025-02-13T16:02:11.302799200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 16:02:11.307923 containerd[1568]: time="2025-02-13T16:02:11.307896329Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056"
Feb 13 16:02:11.328497 containerd[1568]: time="2025-02-13T16:02:11.328471489Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 16:02:11.342110 containerd[1568]: time="2025-02-13T16:02:11.342048780Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 16:02:11.348013 containerd[1568]: time="2025-02-13T16:02:11.347544029Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 16:02:11.348242 containerd[1568]: time="2025-02-13T16:02:11.348218085Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 888.33887ms"
Feb 13 16:02:11.354597 containerd[1568]: time="2025-02-13T16:02:11.354539666Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Feb 13 16:02:11.359173 containerd[1568]: time="2025-02-13T16:02:11.359078528Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 16:02:11.365405 containerd[1568]: time="2025-02-13T16:02:11.365379550Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Feb 13 16:02:11.373608 containerd[1568]: time="2025-02-13T16:02:11.373583155Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 897.039269ms"
Feb 13 16:02:11.392647 containerd[1568]: time="2025-02-13T16:02:11.392552669Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 916.318906ms"
Feb 13 16:02:11.430363 kubelet[2433]: E0213 16:02:11.430328 2433 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="1.6s"
Feb 13 16:02:11.675181 kubelet[2433]: W0213 16:02:11.492270 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:11.675181 kubelet[2433]: E0213 16:02:11.492319 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:11.675181 kubelet[2433]: I0213 16:02:11.589519 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:11.675181 kubelet[2433]: E0213 16:02:11.589745 2433 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost"
Feb 13 16:02:11.794114 containerd[1568]: time="2025-02-13T16:02:11.792148847Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:11.795126 containerd[1568]: time="2025-02-13T16:02:11.794126651Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:11.795126 containerd[1568]: time="2025-02-13T16:02:11.794156931Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.795126 containerd[1568]: time="2025-02-13T16:02:11.794250393Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.795357 containerd[1568]: time="2025-02-13T16:02:11.795309208Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:11.795431 containerd[1568]: time="2025-02-13T16:02:11.795344572Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:11.795431 containerd[1568]: time="2025-02-13T16:02:11.795409650Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.795747 containerd[1568]: time="2025-02-13T16:02:11.795514344Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.798124 containerd[1568]: time="2025-02-13T16:02:11.797823029Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:11.798124 containerd[1568]: time="2025-02-13T16:02:11.797955214Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:11.798124 containerd[1568]: time="2025-02-13T16:02:11.797965711Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.798124 containerd[1568]: time="2025-02-13T16:02:11.798102557Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:11.814049 systemd[1]: Started cri-containerd-4fa6ba4c2deeaa13e1c3558b16293ba3bd4bc7a1badf0e09445db858f75147d1.scope - libcontainer container 4fa6ba4c2deeaa13e1c3558b16293ba3bd4bc7a1badf0e09445db858f75147d1.
Feb 13 16:02:11.817359 systemd[1]: Started cri-containerd-25d3269a82c7fe0662f61e7d23cbae55ac4c61211deb2559958e3897cd05eb83.scope - libcontainer container 25d3269a82c7fe0662f61e7d23cbae55ac4c61211deb2559958e3897cd05eb83.
Feb 13 16:02:11.818538 systemd[1]: Started cri-containerd-4801e38f9b48da919c10bb78c6055dd4494083202eaab64d74720a154e42fb0a.scope - libcontainer container 4801e38f9b48da919c10bb78c6055dd4494083202eaab64d74720a154e42fb0a.
Feb 13 16:02:11.865181 containerd[1568]: time="2025-02-13T16:02:11.865159262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:98eb2295280bc6da80e83f7636be329c,Namespace:kube-system,Attempt:0,} returns sandbox id \"4801e38f9b48da919c10bb78c6055dd4494083202eaab64d74720a154e42fb0a\""
Feb 13 16:02:12.006076 containerd[1568]: time="2025-02-13T16:02:11.866607980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:04cca2c455deeb5da380812dcab224d8,Namespace:kube-system,Attempt:0,} returns sandbox id \"4fa6ba4c2deeaa13e1c3558b16293ba3bd4bc7a1badf0e09445db858f75147d1\""
Feb 13 16:02:12.006076 containerd[1568]: time="2025-02-13T16:02:11.868746256Z" level=info msg="CreateContainer within sandbox \"4801e38f9b48da919c10bb78c6055dd4494083202eaab64d74720a154e42fb0a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Feb 13 16:02:12.006076 containerd[1568]: time="2025-02-13T16:02:11.869325767Z" level=info msg="CreateContainer within sandbox \"4fa6ba4c2deeaa13e1c3558b16293ba3bd4bc7a1badf0e09445db858f75147d1\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Feb 13 16:02:12.006076 containerd[1568]: time="2025-02-13T16:02:11.869780709Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:490d524cde0ce697ff81843f7337cc40,Namespace:kube-system,Attempt:0,} returns sandbox id \"25d3269a82c7fe0662f61e7d23cbae55ac4c61211deb2559958e3897cd05eb83\""
Feb 13 16:02:12.006076 containerd[1568]: time="2025-02-13T16:02:11.871760459Z" level=info msg="CreateContainer within sandbox \"25d3269a82c7fe0662f61e7d23cbae55ac4c61211deb2559958e3897cd05eb83\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Feb 13 16:02:12.006276 kubelet[2433]: E0213 16:02:11.999304 2433 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:12.400501 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3919467263.mount: Deactivated successfully.
Feb 13 16:02:12.408725 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3418155761.mount: Deactivated successfully.
Feb 13 16:02:12.497539 containerd[1568]: time="2025-02-13T16:02:12.497414152Z" level=info msg="CreateContainer within sandbox \"25d3269a82c7fe0662f61e7d23cbae55ac4c61211deb2559958e3897cd05eb83\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d3e2fe2b7f7c45afc1db71386471592acdf82d5673b49f1ea907211bc39a6639\""
Feb 13 16:02:12.498212 containerd[1568]: time="2025-02-13T16:02:12.498077350Z" level=info msg="StartContainer for \"d3e2fe2b7f7c45afc1db71386471592acdf82d5673b49f1ea907211bc39a6639\""
Feb 13 16:02:12.519055 systemd[1]: Started cri-containerd-d3e2fe2b7f7c45afc1db71386471592acdf82d5673b49f1ea907211bc39a6639.scope - libcontainer container d3e2fe2b7f7c45afc1db71386471592acdf82d5673b49f1ea907211bc39a6639.
Feb 13 16:02:12.580566 containerd[1568]: time="2025-02-13T16:02:12.580537915Z" level=info msg="StartContainer for \"d3e2fe2b7f7c45afc1db71386471592acdf82d5673b49f1ea907211bc39a6639\" returns successfully"
Feb 13 16:02:12.705643 containerd[1568]: time="2025-02-13T16:02:12.705455197Z" level=info msg="CreateContainer within sandbox \"4fa6ba4c2deeaa13e1c3558b16293ba3bd4bc7a1badf0e09445db858f75147d1\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"2742f8e6690a301bd71e2514f37fbb911b513eef7dba83576fce3de8c32ece86\""
Feb 13 16:02:12.709732 containerd[1568]: time="2025-02-13T16:02:12.705859687Z" level=info msg="StartContainer for \"2742f8e6690a301bd71e2514f37fbb911b513eef7dba83576fce3de8c32ece86\""
Feb 13 16:02:12.723300 containerd[1568]: time="2025-02-13T16:02:12.723201470Z" level=info msg="CreateContainer within sandbox \"4801e38f9b48da919c10bb78c6055dd4494083202eaab64d74720a154e42fb0a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"03a543d87094e986a079d6769c8f3b16303734284cc311116cf08e19ff4fb0fb\""
Feb 13 16:02:12.723927 containerd[1568]: time="2025-02-13T16:02:12.723777059Z" level=info msg="StartContainer for \"03a543d87094e986a079d6769c8f3b16303734284cc311116cf08e19ff4fb0fb\""
Feb 13 16:02:12.724082 systemd[1]: Started cri-containerd-2742f8e6690a301bd71e2514f37fbb911b513eef7dba83576fce3de8c32ece86.scope - libcontainer container 2742f8e6690a301bd71e2514f37fbb911b513eef7dba83576fce3de8c32ece86.
Feb 13 16:02:12.746015 systemd[1]: Started cri-containerd-03a543d87094e986a079d6769c8f3b16303734284cc311116cf08e19ff4fb0fb.scope - libcontainer container 03a543d87094e986a079d6769c8f3b16303734284cc311116cf08e19ff4fb0fb.
Feb 13 16:02:12.762295 containerd[1568]: time="2025-02-13T16:02:12.762054902Z" level=info msg="StartContainer for \"2742f8e6690a301bd71e2514f37fbb911b513eef7dba83576fce3de8c32ece86\" returns successfully"
Feb 13 16:02:12.785462 containerd[1568]: time="2025-02-13T16:02:12.785437120Z" level=info msg="StartContainer for \"03a543d87094e986a079d6769c8f3b16303734284cc311116cf08e19ff4fb0fb\" returns successfully"
Feb 13 16:02:13.024484 kubelet[2433]: W0213 16:02:13.024391 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:13.024484 kubelet[2433]: E0213 16:02:13.024418 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.107:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:13.031269 kubelet[2433]: E0213 16:02:13.031249 2433 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="3.2s"
Feb 13 16:02:13.064526 kubelet[2433]: W0213 16:02:13.064468 2433 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused
Feb 13 16:02:13.064526 kubelet[2433]: E0213 16:02:13.064495 2433 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.107:6443: connect: connection refused" logger="UnhandledError"
Feb 13 16:02:13.191291 kubelet[2433]: I0213 16:02:13.191270 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:13.191505 kubelet[2433]: E0213 16:02:13.191473 2433 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost"
Feb 13 16:02:14.617977 kubelet[2433]: E0213 16:02:14.617938 2433 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Feb 13 16:02:14.965295 kubelet[2433]: E0213 16:02:14.965227 2433 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Feb 13 16:02:15.013619 kubelet[2433]: I0213 16:02:15.013544 2433 apiserver.go:52] "Watching apiserver"
Feb 13 16:02:15.028384 kubelet[2433]: I0213 16:02:15.028356 2433 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Feb 13 16:02:15.394869 kubelet[2433]: E0213 16:02:15.394844 2433 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Feb 13 16:02:16.234359 kubelet[2433]: E0213 16:02:16.234335 2433 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Feb 13 16:02:16.303016 kubelet[2433]: E0213 16:02:16.302992 2433 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found
Feb 13 16:02:16.399693 kubelet[2433]: I0213 16:02:16.399629 2433 kubelet_node_status.go:72] "Attempting to register node" node="localhost"
Feb 13 16:02:16.405937 kubelet[2433]: I0213 16:02:16.405900 2433 kubelet_node_status.go:75] "Successfully registered node" node="localhost"
Feb 13 16:02:16.629987 systemd[1]: Reload requested from client PID 2713 ('systemctl') (unit session-9.scope)...
Feb 13 16:02:16.630001 systemd[1]: Reloading...
Feb 13 16:02:16.692939 zram_generator::config[2761]: No configuration found.
Feb 13 16:02:16.757129 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 16:02:16.775317 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 16:02:16.850090 systemd[1]: Reloading finished in 219 ms.
Feb 13 16:02:16.868558 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:02:16.876155 systemd[1]: kubelet.service: Deactivated successfully.
Feb 13 16:02:16.876306 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:02:16.876342 systemd[1]: kubelet.service: Consumed 596ms CPU time, 115.3M memory peak.
Feb 13 16:02:16.882130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 16:02:17.440318 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 16:02:17.449344 (kubelet)[2825]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 16:02:17.509424 kubelet[2825]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 16:02:17.509835 kubelet[2825]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 13 16:02:17.509835 kubelet[2825]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 16:02:17.520528 kubelet[2825]: I0213 16:02:17.520484 2825 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 16:02:17.526419 kubelet[2825]: I0213 16:02:17.526347 2825 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
Feb 13 16:02:17.526419 kubelet[2825]: I0213 16:02:17.526415 2825 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 16:02:17.526570 kubelet[2825]: I0213 16:02:17.526558 2825 server.go:929] "Client rotation is on, will bootstrap in background"
Feb 13 16:02:17.527350 kubelet[2825]: I0213 16:02:17.527339 2825 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 13 16:02:17.545337 kubelet[2825]: I0213 16:02:17.545021 2825 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 16:02:17.546694 kubelet[2825]: E0213 16:02:17.546675 2825 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService"
Feb 13 16:02:17.546694 kubelet[2825]: I0213 16:02:17.546693 2825 server.go:1403] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config."
Feb 13 16:02:17.549969 kubelet[2825]: I0213 16:02:17.548375 2825 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 16:02:17.549969 kubelet[2825]: I0213 16:02:17.548437 2825 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Feb 13 16:02:17.549969 kubelet[2825]: I0213 16:02:17.548506 2825 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 16:02:17.549969 kubelet[2825]: I0213 16:02:17.548518 2825 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548674 2825 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548682 2825 container_manager_linux.go:300] "Creating device plugin manager"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548712 2825 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548765 2825 kubelet.go:408] "Attempting to sync node with API server"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548771 2825 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548788 2825 kubelet.go:314] "Adding apiserver pod source"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.548799 2825 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.549376 2825 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 16:02:17.550111 kubelet[2825]: I0213 16:02:17.549775 2825 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 16:02:17.550312 kubelet[2825]: I0213 16:02:17.550302 2825 server.go:1269] "Started kubelet"
Feb 13 16:02:17.562914 kubelet[2825]: I0213 16:02:17.562648 2825 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 16:02:17.564178 kubelet[2825]: I0213 16:02:17.564147 2825 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 16:02:17.564414 kubelet[2825]: I0213 16:02:17.564407 2825 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 16:02:17.566147 kubelet[2825]: I0213 16:02:17.566135 2825 server.go:460] "Adding debug handlers to kubelet server" Feb 13 16:02:17.567151 kubelet[2825]: I0213 16:02:17.567139 2825 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 16:02:17.568095 kubelet[2825]: E0213 16:02:17.568071 2825 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 16:02:17.568710 kubelet[2825]: I0213 16:02:17.568701 2825 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 16:02:17.581746 kubelet[2825]: I0213 16:02:17.581726 2825 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 13 16:02:17.581924 kubelet[2825]: E0213 16:02:17.581892 2825 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 16:02:17.582569 kubelet[2825]: I0213 16:02:17.582525 2825 factory.go:221] Registration of the systemd container factory successfully Feb 13 16:02:17.582786 kubelet[2825]: I0213 16:02:17.582677 2825 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 16:02:17.585555 kubelet[2825]: I0213 16:02:17.585535 2825 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 13 16:02:17.586115 kubelet[2825]: I0213 16:02:17.585706 2825 reconciler.go:26] "Reconciler: start to sync state" Feb 13 16:02:17.587445 kubelet[2825]: I0213 16:02:17.587425 2825 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 16:02:17.589705 kubelet[2825]: I0213 16:02:17.589668 2825 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 16:02:17.589772 kubelet[2825]: I0213 16:02:17.589718 2825 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 13 16:02:17.589772 kubelet[2825]: I0213 16:02:17.589732 2825 kubelet.go:2321] "Starting kubelet main sync loop" Feb 13 16:02:17.589772 kubelet[2825]: E0213 16:02:17.589757 2825 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 16:02:17.591139 kubelet[2825]: I0213 16:02:17.591128 2825 factory.go:221] Registration of the containerd container factory successfully Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637652 2825 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637662 2825 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637673 2825 state_mem.go:36] "Initialized new in-memory state store" Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637758 2825 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637764 2825 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 13 16:02:17.637808 kubelet[2825]: I0213 16:02:17.637776 2825 policy_none.go:49] "None policy: Start" Feb 13 16:02:17.638462 kubelet[2825]: I0213 16:02:17.638293 2825 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 13 16:02:17.638462 kubelet[2825]: I0213 16:02:17.638305 2825 state_mem.go:35] "Initializing new in-memory state store" Feb 13 16:02:17.638462 kubelet[2825]: I0213 16:02:17.638427 2825 state_mem.go:75] "Updated machine memory state" Feb 13 16:02:17.641079 kubelet[2825]: I0213 16:02:17.641071 2825 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 16:02:17.641248 kubelet[2825]: I0213 16:02:17.641241 2825 eviction_manager.go:189] 
"Eviction manager: starting control loop" Feb 13 16:02:17.641352 kubelet[2825]: I0213 16:02:17.641289 2825 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 16:02:17.641589 kubelet[2825]: I0213 16:02:17.641456 2825 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 16:02:17.743327 kubelet[2825]: I0213 16:02:17.743233 2825 kubelet_node_status.go:72] "Attempting to register node" node="localhost" Feb 13 16:02:17.748052 kubelet[2825]: I0213 16:02:17.747835 2825 kubelet_node_status.go:111] "Node was previously registered" node="localhost" Feb 13 16:02:17.748441 kubelet[2825]: I0213 16:02:17.748424 2825 kubelet_node_status.go:75] "Successfully registered node" node="localhost" Feb 13 16:02:17.786727 kubelet[2825]: I0213 16:02:17.786701 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/04cca2c455deeb5da380812dcab224d8-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"04cca2c455deeb5da380812dcab224d8\") " pod="kube-system/kube-scheduler-localhost" Feb 13 16:02:17.786727 kubelet[2825]: I0213 16:02:17.786725 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:02:17.786839 kubelet[2825]: I0213 16:02:17.786736 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:02:17.786839 kubelet[2825]: I0213 
16:02:17.786747 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:02:17.786839 kubelet[2825]: I0213 16:02:17.786757 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:02:17.786839 kubelet[2825]: I0213 16:02:17.786767 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/98eb2295280bc6da80e83f7636be329c-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"98eb2295280bc6da80e83f7636be329c\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:02:17.786839 kubelet[2825]: I0213 16:02:17.786775 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:02:17.787782 kubelet[2825]: I0213 16:02:17.786787 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:02:17.787782 kubelet[2825]: I0213 
16:02:17.786796 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/490d524cde0ce697ff81843f7337cc40-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"490d524cde0ce697ff81843f7337cc40\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:02:18.559267 kubelet[2825]: I0213 16:02:18.559243 2825 apiserver.go:52] "Watching apiserver" Feb 13 16:02:18.585825 kubelet[2825]: I0213 16:02:18.585789 2825 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 13 16:02:18.650433 kubelet[2825]: I0213 16:02:18.650311 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.6502954380000001 podStartE2EDuration="1.650295438s" podCreationTimestamp="2025-02-13 16:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:18.65011828 +0000 UTC m=+1.187472697" watchObservedRunningTime="2025-02-13 16:02:18.650295438 +0000 UTC m=+1.187649851" Feb 13 16:02:18.687323 kubelet[2825]: I0213 16:02:18.687267 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.687253066 podStartE2EDuration="1.687253066s" podCreationTimestamp="2025-02-13 16:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:18.664828791 +0000 UTC m=+1.202183200" watchObservedRunningTime="2025-02-13 16:02:18.687253066 +0000 UTC m=+1.224607483" Feb 13 16:02:18.745814 kubelet[2825]: I0213 16:02:18.745254 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.745240576 
podStartE2EDuration="1.745240576s" podCreationTimestamp="2025-02-13 16:02:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:18.689525383 +0000 UTC m=+1.226879800" watchObservedRunningTime="2025-02-13 16:02:18.745240576 +0000 UTC m=+1.282594985" Feb 13 16:02:21.224850 kubelet[2825]: I0213 16:02:21.224827 2825 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 16:02:21.225312 containerd[1568]: time="2025-02-13T16:02:21.225068122Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 16:02:21.225981 kubelet[2825]: I0213 16:02:21.225583 2825 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 16:02:21.922531 sudo[1884]: pam_unix(sudo:session): session closed for user root Feb 13 16:02:21.923210 sshd[1883]: Connection closed by 147.75.109.163 port 53078 Feb 13 16:02:21.923836 sshd-session[1880]: pam_unix(sshd:session): session closed for user core Feb 13 16:02:21.925855 systemd[1]: sshd@6-139.178.70.107:22-147.75.109.163:53078.service: Deactivated successfully. Feb 13 16:02:21.927063 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 16:02:21.927194 systemd[1]: session-9.scope: Consumed 2.750s CPU time, 150.4M memory peak. Feb 13 16:02:21.928089 systemd-logind[1551]: Session 9 logged out. Waiting for processes to exit. Feb 13 16:02:21.928838 systemd-logind[1551]: Removed session 9. Feb 13 16:02:22.132682 systemd[1]: Created slice kubepods-besteffort-pod7a59b02f_ae51_4261_8191_5f078fbb5b47.slice - libcontainer container kubepods-besteffort-pod7a59b02f_ae51_4261_8191_5f078fbb5b47.slice. 
Feb 13 16:02:22.216085 kubelet[2825]: I0213 16:02:22.216019 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xxmd\" (UniqueName: \"kubernetes.io/projected/7a59b02f-ae51-4261-8191-5f078fbb5b47-kube-api-access-5xxmd\") pod \"kube-proxy-5z2bq\" (UID: \"7a59b02f-ae51-4261-8191-5f078fbb5b47\") " pod="kube-system/kube-proxy-5z2bq"
Feb 13 16:02:22.216085 kubelet[2825]: I0213 16:02:22.216054 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7a59b02f-ae51-4261-8191-5f078fbb5b47-lib-modules\") pod \"kube-proxy-5z2bq\" (UID: \"7a59b02f-ae51-4261-8191-5f078fbb5b47\") " pod="kube-system/kube-proxy-5z2bq"
Feb 13 16:02:22.216186 kubelet[2825]: I0213 16:02:22.216072 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7a59b02f-ae51-4261-8191-5f078fbb5b47-kube-proxy\") pod \"kube-proxy-5z2bq\" (UID: \"7a59b02f-ae51-4261-8191-5f078fbb5b47\") " pod="kube-system/kube-proxy-5z2bq"
Feb 13 16:02:22.216186 kubelet[2825]: I0213 16:02:22.216108 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7a59b02f-ae51-4261-8191-5f078fbb5b47-xtables-lock\") pod \"kube-proxy-5z2bq\" (UID: \"7a59b02f-ae51-4261-8191-5f078fbb5b47\") " pod="kube-system/kube-proxy-5z2bq"
Feb 13 16:02:22.221847 systemd[1]: Created slice kubepods-besteffort-pod4dcd6d73_9936_47d8_bd53_585609ccc1b2.slice - libcontainer container kubepods-besteffort-pod4dcd6d73_9936_47d8_bd53_585609ccc1b2.slice.
Feb 13 16:02:22.316651 kubelet[2825]: I0213 16:02:22.316607 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/4dcd6d73-9936-47d8-bd53-585609ccc1b2-var-lib-calico\") pod \"tigera-operator-76c4976dd7-66hqp\" (UID: \"4dcd6d73-9936-47d8-bd53-585609ccc1b2\") " pod="tigera-operator/tigera-operator-76c4976dd7-66hqp"
Feb 13 16:02:22.316892 kubelet[2825]: I0213 16:02:22.316662 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp9vr\" (UniqueName: \"kubernetes.io/projected/4dcd6d73-9936-47d8-bd53-585609ccc1b2-kube-api-access-hp9vr\") pod \"tigera-operator-76c4976dd7-66hqp\" (UID: \"4dcd6d73-9936-47d8-bd53-585609ccc1b2\") " pod="tigera-operator/tigera-operator-76c4976dd7-66hqp"
Feb 13 16:02:22.439820 containerd[1568]: time="2025-02-13T16:02:22.439678226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5z2bq,Uid:7a59b02f-ae51-4261-8191-5f078fbb5b47,Namespace:kube-system,Attempt:0,}"
Feb 13 16:02:22.456892 containerd[1568]: time="2025-02-13T16:02:22.456705363Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:22.456892 containerd[1568]: time="2025-02-13T16:02:22.456747724Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:22.456892 containerd[1568]: time="2025-02-13T16:02:22.456755449Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:22.456892 containerd[1568]: time="2025-02-13T16:02:22.456805669Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:22.478120 systemd[1]: Started cri-containerd-57f2b6d13c6d09d5993f83c39a078819c948618637f7483aa9a8973c5a57f353.scope - libcontainer container 57f2b6d13c6d09d5993f83c39a078819c948618637f7483aa9a8973c5a57f353.
Feb 13 16:02:22.497047 containerd[1568]: time="2025-02-13T16:02:22.496988911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-5z2bq,Uid:7a59b02f-ae51-4261-8191-5f078fbb5b47,Namespace:kube-system,Attempt:0,} returns sandbox id \"57f2b6d13c6d09d5993f83c39a078819c948618637f7483aa9a8973c5a57f353\""
Feb 13 16:02:22.499608 containerd[1568]: time="2025-02-13T16:02:22.499452125Z" level=info msg="CreateContainer within sandbox \"57f2b6d13c6d09d5993f83c39a078819c948618637f7483aa9a8973c5a57f353\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Feb 13 16:02:22.508374 containerd[1568]: time="2025-02-13T16:02:22.508337078Z" level=info msg="CreateContainer within sandbox \"57f2b6d13c6d09d5993f83c39a078819c948618637f7483aa9a8973c5a57f353\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"50001782964033c23359f2b6145449301613f4dc7d8bff2ae58e727f70a5819d\""
Feb 13 16:02:22.508941 containerd[1568]: time="2025-02-13T16:02:22.508920905Z" level=info msg="StartContainer for \"50001782964033c23359f2b6145449301613f4dc7d8bff2ae58e727f70a5819d\""
Feb 13 16:02:22.524669 containerd[1568]: time="2025-02-13T16:02:22.524619683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-66hqp,Uid:4dcd6d73-9936-47d8-bd53-585609ccc1b2,Namespace:tigera-operator,Attempt:0,}"
Feb 13 16:02:22.533118 systemd[1]: Started cri-containerd-50001782964033c23359f2b6145449301613f4dc7d8bff2ae58e727f70a5819d.scope - libcontainer container 50001782964033c23359f2b6145449301613f4dc7d8bff2ae58e727f70a5819d.
Feb 13 16:02:22.544869 containerd[1568]: time="2025-02-13T16:02:22.544703454Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:22.545368 containerd[1568]: time="2025-02-13T16:02:22.545093154Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:22.545368 containerd[1568]: time="2025-02-13T16:02:22.545114575Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:22.546147 containerd[1568]: time="2025-02-13T16:02:22.546109848Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:22.563862 containerd[1568]: time="2025-02-13T16:02:22.563685330Z" level=info msg="StartContainer for \"50001782964033c23359f2b6145449301613f4dc7d8bff2ae58e727f70a5819d\" returns successfully"
Feb 13 16:02:22.564045 systemd[1]: Started cri-containerd-236daf699aa6151975ee19c89f0cfb377a9b19b5200ab2f28ab1db3177f24871.scope - libcontainer container 236daf699aa6151975ee19c89f0cfb377a9b19b5200ab2f28ab1db3177f24871.
Feb 13 16:02:22.597458 containerd[1568]: time="2025-02-13T16:02:22.597436091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-76c4976dd7-66hqp,Uid:4dcd6d73-9936-47d8-bd53-585609ccc1b2,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"236daf699aa6151975ee19c89f0cfb377a9b19b5200ab2f28ab1db3177f24871\""
Feb 13 16:02:22.599244 containerd[1568]: time="2025-02-13T16:02:22.599222322Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\""
Feb 13 16:02:22.635641 kubelet[2825]: I0213 16:02:22.635602 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-5z2bq" podStartSLOduration=0.635589151 podStartE2EDuration="635.589151ms" podCreationTimestamp="2025-02-13 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:22.635409891 +0000 UTC m=+5.172764307" watchObservedRunningTime="2025-02-13 16:02:22.635589151 +0000 UTC m=+5.172943561"
Feb 13 16:02:25.563839 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1067414706.mount: Deactivated successfully.
Feb 13 16:02:25.944369 containerd[1568]: time="2025-02-13T16:02:25.943784673Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:25.944369 containerd[1568]: time="2025-02-13T16:02:25.944331309Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497"
Feb 13 16:02:25.946445 containerd[1568]: time="2025-02-13T16:02:25.946428235Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:25.952345 containerd[1568]: time="2025-02-13T16:02:25.952331974Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:25.952644 containerd[1568]: time="2025-02-13T16:02:25.952626644Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 3.353314183s"
Feb 13 16:02:25.952673 containerd[1568]: time="2025-02-13T16:02:25.952645635Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\""
Feb 13 16:02:25.954432 containerd[1568]: time="2025-02-13T16:02:25.954072501Z" level=info msg="CreateContainer within sandbox \"236daf699aa6151975ee19c89f0cfb377a9b19b5200ab2f28ab1db3177f24871\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Feb 13 16:02:25.995448 containerd[1568]: time="2025-02-13T16:02:25.995421317Z" level=info msg="CreateContainer within sandbox \"236daf699aa6151975ee19c89f0cfb377a9b19b5200ab2f28ab1db3177f24871\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"19a730c978c3ef4023af0797681afc12d1379bfaf4d98a064837a9a2f70ee069\""
Feb 13 16:02:25.996352 containerd[1568]: time="2025-02-13T16:02:25.996333938Z" level=info msg="StartContainer for \"19a730c978c3ef4023af0797681afc12d1379bfaf4d98a064837a9a2f70ee069\""
Feb 13 16:02:26.015020 systemd[1]: Started cri-containerd-19a730c978c3ef4023af0797681afc12d1379bfaf4d98a064837a9a2f70ee069.scope - libcontainer container 19a730c978c3ef4023af0797681afc12d1379bfaf4d98a064837a9a2f70ee069.
Feb 13 16:02:26.031226 containerd[1568]: time="2025-02-13T16:02:26.031205828Z" level=info msg="StartContainer for \"19a730c978c3ef4023af0797681afc12d1379bfaf4d98a064837a9a2f70ee069\" returns successfully"
Feb 13 16:02:26.857213 kubelet[2825]: I0213 16:02:26.857131 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-76c4976dd7-66hqp" podStartSLOduration=1.502100363 podStartE2EDuration="4.857116293s" podCreationTimestamp="2025-02-13 16:02:22 +0000 UTC" firstStartedPulling="2025-02-13 16:02:22.598281859 +0000 UTC m=+5.135636266" lastFinishedPulling="2025-02-13 16:02:25.953297788 +0000 UTC m=+8.490652196" observedRunningTime="2025-02-13 16:02:26.642700379 +0000 UTC m=+9.180054808" watchObservedRunningTime="2025-02-13 16:02:26.857116293 +0000 UTC m=+9.394470705"
Feb 13 16:02:28.892969 systemd[1]: Created slice kubepods-besteffort-pod7c2dbe1d_3475_4583_8635_7156204f4a3f.slice - libcontainer container kubepods-besteffort-pod7c2dbe1d_3475_4583_8635_7156204f4a3f.slice.
Feb 13 16:02:28.960321 kubelet[2825]: I0213 16:02:28.960292 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c2dbe1d-3475-4583-8635-7156204f4a3f-tigera-ca-bundle\") pod \"calico-typha-76787589f9-pk2bz\" (UID: \"7c2dbe1d-3475-4583-8635-7156204f4a3f\") " pod="calico-system/calico-typha-76787589f9-pk2bz"
Feb 13 16:02:28.960321 kubelet[2825]: I0213 16:02:28.960322 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7c2dbe1d-3475-4583-8635-7156204f4a3f-typha-certs\") pod \"calico-typha-76787589f9-pk2bz\" (UID: \"7c2dbe1d-3475-4583-8635-7156204f4a3f\") " pod="calico-system/calico-typha-76787589f9-pk2bz"
Feb 13 16:02:28.960640 kubelet[2825]: I0213 16:02:28.960338 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzmd\" (UniqueName: \"kubernetes.io/projected/7c2dbe1d-3475-4583-8635-7156204f4a3f-kube-api-access-qmzmd\") pod \"calico-typha-76787589f9-pk2bz\" (UID: \"7c2dbe1d-3475-4583-8635-7156204f4a3f\") " pod="calico-system/calico-typha-76787589f9-pk2bz"
Feb 13 16:02:29.062679 systemd[1]: Created slice kubepods-besteffort-pod0e0b7a81_6d09_49a6_8896_9ca8e689d7f5.slice - libcontainer container kubepods-besteffort-pod0e0b7a81_6d09_49a6_8896_9ca8e689d7f5.slice.
Feb 13 16:02:29.158211 kubelet[2825]: E0213 16:02:29.158122 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365"
Feb 13 16:02:29.161246 kubelet[2825]: I0213 16:02:29.161170 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-var-run-calico\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161246 kubelet[2825]: I0213 16:02:29.161209 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-policysync\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161246 kubelet[2825]: I0213 16:02:29.161223 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-cni-net-dir\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161246 kubelet[2825]: I0213 16:02:29.161240 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-tigera-ca-bundle\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161399 kubelet[2825]: I0213 16:02:29.161254 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-node-certs\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161399 kubelet[2825]: I0213 16:02:29.161266 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-var-lib-calico\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161399 kubelet[2825]: I0213 16:02:29.161280 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-cni-log-dir\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161399 kubelet[2825]: I0213 16:02:29.161291 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-flexvol-driver-host\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161399 kubelet[2825]: I0213 16:02:29.161302 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxsf9\" (UniqueName: \"kubernetes.io/projected/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-kube-api-access-zxsf9\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161482 kubelet[2825]: I0213 16:02:29.161313 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-lib-modules\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161482 kubelet[2825]: I0213 16:02:29.161321 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-xtables-lock\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.161482 kubelet[2825]: I0213 16:02:29.161342 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/0e0b7a81-6d09-49a6-8896-9ca8e689d7f5-cni-bin-dir\") pod \"calico-node-gr9pp\" (UID: \"0e0b7a81-6d09-49a6-8896-9ca8e689d7f5\") " pod="calico-system/calico-node-gr9pp"
Feb 13 16:02:29.202682 containerd[1568]: time="2025-02-13T16:02:29.202653550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76787589f9-pk2bz,Uid:7c2dbe1d-3475-4583-8635-7156204f4a3f,Namespace:calico-system,Attempt:0,}"
Feb 13 16:02:29.228995 containerd[1568]: time="2025-02-13T16:02:29.228469660Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:29.228995 containerd[1568]: time="2025-02-13T16:02:29.228519454Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:29.228995 containerd[1568]: time="2025-02-13T16:02:29.228530877Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:29.229578 containerd[1568]: time="2025-02-13T16:02:29.228804381Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:29.252204 systemd[1]: Started cri-containerd-6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a.scope - libcontainer container 6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a.
Feb 13 16:02:29.261803 kubelet[2825]: I0213 16:02:29.261776 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fb1149e0-8e00-49ff-a8bd-416370ecd365-registration-dir\") pod \"csi-node-driver-cv4n7\" (UID: \"fb1149e0-8e00-49ff-a8bd-416370ecd365\") " pod="calico-system/csi-node-driver-cv4n7"
Feb 13 16:02:29.261874 kubelet[2825]: I0213 16:02:29.261828 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fb1149e0-8e00-49ff-a8bd-416370ecd365-socket-dir\") pod \"csi-node-driver-cv4n7\" (UID: \"fb1149e0-8e00-49ff-a8bd-416370ecd365\") " pod="calico-system/csi-node-driver-cv4n7"
Feb 13 16:02:29.261874 kubelet[2825]: I0213 16:02:29.261859 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvt5w\" (UniqueName: \"kubernetes.io/projected/fb1149e0-8e00-49ff-a8bd-416370ecd365-kube-api-access-cvt5w\") pod \"csi-node-driver-cv4n7\" (UID: \"fb1149e0-8e00-49ff-a8bd-416370ecd365\") " pod="calico-system/csi-node-driver-cv4n7"
Feb 13 16:02:29.261945 kubelet[2825]: I0213 16:02:29.261933 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/fb1149e0-8e00-49ff-a8bd-416370ecd365-varrun\") pod \"csi-node-driver-cv4n7\" (UID: \"fb1149e0-8e00-49ff-a8bd-416370ecd365\") " pod="calico-system/csi-node-driver-cv4n7"
Feb 13 16:02:29.262873 kubelet[2825]: I0213 16:02:29.261971 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fb1149e0-8e00-49ff-a8bd-416370ecd365-kubelet-dir\") pod \"csi-node-driver-cv4n7\" (UID: \"fb1149e0-8e00-49ff-a8bd-416370ecd365\") " pod="calico-system/csi-node-driver-cv4n7"
Feb 13 16:02:29.286952 kubelet[2825]: E0213 16:02:29.286889 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 16:02:29.287184 kubelet[2825]: W0213 16:02:29.287049 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 16:02:29.287184 kubelet[2825]: E0213 16:02:29.287082 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 16:02:29.287521 kubelet[2825]: E0213 16:02:29.287434 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 16:02:29.287582 kubelet[2825]: W0213 16:02:29.287573 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 16:02:29.287757 kubelet[2825]: E0213 16:02:29.287651 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 16:02:29.287930 kubelet[2825]: E0213 16:02:29.287820 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 16:02:29.287930 kubelet[2825]: W0213 16:02:29.287829 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 16:02:29.288243 kubelet[2825]: E0213 16:02:29.288180 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 16:02:29.288243 kubelet[2825]: W0213 16:02:29.288189 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 16:02:29.288243 kubelet[2825]: E0213 16:02:29.288198 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Feb 13 16:02:29.288562 kubelet[2825]: E0213 16:02:29.288495 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Feb 13 16:02:29.288562 kubelet[2825]: W0213 16:02:29.288502 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Feb 13 16:02:29.288562 kubelet[2825]: E0213 16:02:29.288510 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Feb 13 16:02:29.288792 kubelet[2825]: E0213 16:02:29.288716 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.288792 kubelet[2825]: W0213 16:02:29.288723 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.288792 kubelet[2825]: E0213 16:02:29.288730 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.288943 kubelet[2825]: E0213 16:02:29.288937 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.289719 kubelet[2825]: W0213 16:02:29.288979 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.289719 kubelet[2825]: E0213 16:02:29.288988 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.290288 kubelet[2825]: E0213 16:02:29.290054 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.290461 kubelet[2825]: W0213 16:02:29.290450 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.290556 kubelet[2825]: E0213 16:02:29.290541 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.290635 kubelet[2825]: E0213 16:02:29.290628 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.293919 kubelet[2825]: E0213 16:02:29.293889 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.293919 kubelet[2825]: W0213 16:02:29.293907 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.294178 kubelet[2825]: E0213 16:02:29.294138 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.299669 containerd[1568]: time="2025-02-13T16:02:29.299601888Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-76787589f9-pk2bz,Uid:7c2dbe1d-3475-4583-8635-7156204f4a3f,Namespace:calico-system,Attempt:0,} returns sandbox id \"6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a\"" Feb 13 16:02:29.322225 containerd[1568]: time="2025-02-13T16:02:29.322193312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 16:02:29.362872 kubelet[2825]: E0213 16:02:29.362849 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.362872 kubelet[2825]: W0213 16:02:29.362866 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.362872 kubelet[2825]: E0213 16:02:29.362880 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.363081 kubelet[2825]: E0213 16:02:29.363036 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363081 kubelet[2825]: W0213 16:02:29.363042 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363081 kubelet[2825]: E0213 16:02:29.363055 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.363211 kubelet[2825]: E0213 16:02:29.363199 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363211 kubelet[2825]: W0213 16:02:29.363209 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363264 kubelet[2825]: E0213 16:02:29.363217 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.363375 kubelet[2825]: E0213 16:02:29.363361 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363375 kubelet[2825]: W0213 16:02:29.363371 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363427 kubelet[2825]: E0213 16:02:29.363379 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.363529 kubelet[2825]: E0213 16:02:29.363517 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363529 kubelet[2825]: W0213 16:02:29.363525 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363578 kubelet[2825]: E0213 16:02:29.363549 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.363686 kubelet[2825]: E0213 16:02:29.363674 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363686 kubelet[2825]: W0213 16:02:29.363682 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363737 kubelet[2825]: E0213 16:02:29.363691 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.363806 kubelet[2825]: E0213 16:02:29.363792 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363806 kubelet[2825]: W0213 16:02:29.363800 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363806 kubelet[2825]: E0213 16:02:29.363805 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.363955 kubelet[2825]: E0213 16:02:29.363900 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.363955 kubelet[2825]: W0213 16:02:29.363915 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.363955 kubelet[2825]: E0213 16:02:29.363932 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.364061 kubelet[2825]: E0213 16:02:29.364049 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.364061 kubelet[2825]: W0213 16:02:29.364057 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.364145 kubelet[2825]: E0213 16:02:29.364130 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.364209 kubelet[2825]: E0213 16:02:29.364186 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.364209 kubelet[2825]: W0213 16:02:29.364195 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.364285 kubelet[2825]: E0213 16:02:29.364267 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.364285 kubelet[2825]: E0213 16:02:29.364281 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.364345 kubelet[2825]: W0213 16:02:29.364287 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.364345 kubelet[2825]: E0213 16:02:29.364304 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.364424 kubelet[2825]: E0213 16:02:29.364408 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.364424 kubelet[2825]: W0213 16:02:29.364418 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.364424 kubelet[2825]: E0213 16:02:29.364428 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.364716 kubelet[2825]: E0213 16:02:29.364516 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.364716 kubelet[2825]: W0213 16:02:29.364520 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.364716 kubelet[2825]: E0213 16:02:29.364700 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.365069 containerd[1568]: time="2025-02-13T16:02:29.365051406Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gr9pp,Uid:0e0b7a81-6d09-49a6-8896-9ca8e689d7f5,Namespace:calico-system,Attempt:0,}" Feb 13 16:02:29.365276 kubelet[2825]: E0213 16:02:29.365267 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.365276 kubelet[2825]: W0213 16:02:29.365275 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365377 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365400 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.365868 kubelet[2825]: W0213 16:02:29.365404 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365440 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365498 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.365868 kubelet[2825]: W0213 16:02:29.365503 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365558 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365647 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.365868 kubelet[2825]: W0213 16:02:29.365653 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.365868 kubelet[2825]: E0213 16:02:29.365662 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.365931 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.366607 kubelet[2825]: W0213 16:02:29.365938 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.365946 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.366243 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.366607 kubelet[2825]: W0213 16:02:29.366249 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.366257 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.366357 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.366607 kubelet[2825]: W0213 16:02:29.366364 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.366607 kubelet[2825]: E0213 16:02:29.366373 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.366999 kubelet[2825]: E0213 16:02:29.366794 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.366999 kubelet[2825]: W0213 16:02:29.366801 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.366999 kubelet[2825]: E0213 16:02:29.366817 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.366999 kubelet[2825]: E0213 16:02:29.366892 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.366999 kubelet[2825]: W0213 16:02:29.366896 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.367092 kubelet[2825]: E0213 16:02:29.367045 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.367092 kubelet[2825]: W0213 16:02:29.367050 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.367092 kubelet[2825]: E0213 16:02:29.367056 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.367092 kubelet[2825]: E0213 16:02:29.367071 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.367500 kubelet[2825]: E0213 16:02:29.367489 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.367500 kubelet[2825]: W0213 16:02:29.367495 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.367686 kubelet[2825]: E0213 16:02:29.367502 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.368037 kubelet[2825]: E0213 16:02:29.368003 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.368037 kubelet[2825]: W0213 16:02:29.368012 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.368037 kubelet[2825]: E0213 16:02:29.368020 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.371966 kubelet[2825]: E0213 16:02:29.371888 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.371966 kubelet[2825]: W0213 16:02:29.371908 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.371966 kubelet[2825]: E0213 16:02:29.371928 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.380997 containerd[1568]: time="2025-02-13T16:02:29.380934690Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:29.380997 containerd[1568]: time="2025-02-13T16:02:29.380996161Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:29.381188 containerd[1568]: time="2025-02-13T16:02:29.381013987Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:29.381188 containerd[1568]: time="2025-02-13T16:02:29.381085647Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:29.396081 systemd[1]: Started cri-containerd-7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff.scope - libcontainer container 7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff. 
Feb 13 16:02:29.414696 containerd[1568]: time="2025-02-13T16:02:29.414627806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gr9pp,Uid:0e0b7a81-6d09-49a6-8896-9ca8e689d7f5,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\"" Feb 13 16:02:29.656866 kubelet[2825]: E0213 16:02:29.656846 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657005 kubelet[2825]: W0213 16:02:29.656993 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657106 kubelet[2825]: E0213 16:02:29.657039 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.657242 kubelet[2825]: E0213 16:02:29.657166 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657242 kubelet[2825]: W0213 16:02:29.657179 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657242 kubelet[2825]: E0213 16:02:29.657185 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.657371 kubelet[2825]: E0213 16:02:29.657363 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657413 kubelet[2825]: W0213 16:02:29.657407 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657487 kubelet[2825]: E0213 16:02:29.657440 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.657606 kubelet[2825]: E0213 16:02:29.657543 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657606 kubelet[2825]: W0213 16:02:29.657549 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657606 kubelet[2825]: E0213 16:02:29.657554 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.657706 kubelet[2825]: E0213 16:02:29.657700 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657740 kubelet[2825]: W0213 16:02:29.657733 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657783 kubelet[2825]: E0213 16:02:29.657777 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.657964 kubelet[2825]: E0213 16:02:29.657913 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.657964 kubelet[2825]: W0213 16:02:29.657919 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.657964 kubelet[2825]: E0213 16:02:29.657925 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.658070 kubelet[2825]: E0213 16:02:29.658063 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.658152 kubelet[2825]: W0213 16:02:29.658102 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.658152 kubelet[2825]: E0213 16:02:29.658110 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.658228 kubelet[2825]: E0213 16:02:29.658222 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.658262 kubelet[2825]: W0213 16:02:29.658257 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.658295 kubelet[2825]: E0213 16:02:29.658290 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.658473 kubelet[2825]: E0213 16:02:29.658424 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.658473 kubelet[2825]: W0213 16:02:29.658430 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.658473 kubelet[2825]: E0213 16:02:29.658435 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.658608 kubelet[2825]: E0213 16:02:29.658602 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.658654 kubelet[2825]: W0213 16:02:29.658639 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.658736 kubelet[2825]: E0213 16:02:29.658684 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.658843 kubelet[2825]: E0213 16:02:29.658837 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.658952 kubelet[2825]: W0213 16:02:29.658875 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.658952 kubelet[2825]: E0213 16:02:29.658882 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.659051 kubelet[2825]: E0213 16:02:29.659045 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.659087 kubelet[2825]: W0213 16:02:29.659082 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.659132 kubelet[2825]: E0213 16:02:29.659126 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.659340 kubelet[2825]: E0213 16:02:29.659278 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.659340 kubelet[2825]: W0213 16:02:29.659284 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.659340 kubelet[2825]: E0213 16:02:29.659290 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:29.659459 kubelet[2825]: E0213 16:02:29.659451 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.659545 kubelet[2825]: W0213 16:02:29.659490 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.659545 kubelet[2825]: E0213 16:02:29.659498 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:29.659618 kubelet[2825]: E0213 16:02:29.659613 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:29.659651 kubelet[2825]: W0213 16:02:29.659646 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:29.659680 kubelet[2825]: E0213 16:02:29.659675 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:30.086830 systemd[1]: run-containerd-runc-k8s.io-6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a-runc.cjW1mf.mount: Deactivated successfully. Feb 13 16:02:30.590697 kubelet[2825]: E0213 16:02:30.590621 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:30.873066 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3468457193.mount: Deactivated successfully. 
Feb 13 16:02:32.329315 containerd[1568]: time="2025-02-13T16:02:32.329273836Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:32.343571 containerd[1568]: time="2025-02-13T16:02:32.343422693Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Feb 13 16:02:32.356568 containerd[1568]: time="2025-02-13T16:02:32.356523442Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:32.368430 containerd[1568]: time="2025-02-13T16:02:32.368384058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:32.369192 containerd[1568]: time="2025-02-13T16:02:32.368789456Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 3.046559097s" Feb 13 16:02:32.369192 containerd[1568]: time="2025-02-13T16:02:32.368813228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Feb 13 16:02:32.369513 containerd[1568]: time="2025-02-13T16:02:32.369491980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 16:02:32.440215 containerd[1568]: time="2025-02-13T16:02:32.440123844Z" level=info msg="CreateContainer within sandbox \"6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 16:02:32.456062 containerd[1568]: time="2025-02-13T16:02:32.455991728Z" level=info msg="CreateContainer within sandbox \"6321420d9a6b30d58b2635048dc3136bf9481a277c81a0be412a60ad798bd51a\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"7df0bdd6b83da77dc4edc1c2a5bd73b96c3323f5ea69704e0fe603087c31b1ba\"" Feb 13 16:02:32.456327 containerd[1568]: time="2025-02-13T16:02:32.456313147Z" level=info msg="StartContainer for \"7df0bdd6b83da77dc4edc1c2a5bd73b96c3323f5ea69704e0fe603087c31b1ba\"" Feb 13 16:02:32.493144 systemd[1]: Started cri-containerd-7df0bdd6b83da77dc4edc1c2a5bd73b96c3323f5ea69704e0fe603087c31b1ba.scope - libcontainer container 7df0bdd6b83da77dc4edc1c2a5bd73b96c3323f5ea69704e0fe603087c31b1ba. Feb 13 16:02:32.534432 containerd[1568]: time="2025-02-13T16:02:32.534360617Z" level=info msg="StartContainer for \"7df0bdd6b83da77dc4edc1c2a5bd73b96c3323f5ea69704e0fe603087c31b1ba\" returns successfully" Feb 13 16:02:32.593629 kubelet[2825]: E0213 16:02:32.593376 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:32.858308 kubelet[2825]: I0213 16:02:32.858268 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-76787589f9-pk2bz" podStartSLOduration=1.807332636 podStartE2EDuration="4.858256922s" podCreationTimestamp="2025-02-13 16:02:28 +0000 UTC" firstStartedPulling="2025-02-13 16:02:29.318406145 +0000 UTC m=+11.855760551" lastFinishedPulling="2025-02-13 16:02:32.369330424 +0000 UTC m=+14.906684837" observedRunningTime="2025-02-13 16:02:32.857067866 +0000 UTC m=+15.394422275" watchObservedRunningTime="2025-02-13 16:02:32.858256922 +0000 UTC 
m=+15.395611326" Feb 13 16:02:32.921450 kubelet[2825]: E0213 16:02:32.921404 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921450 kubelet[2825]: W0213 16:02:32.921422 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921450 kubelet[2825]: E0213 16:02:32.921454 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.921618 kubelet[2825]: E0213 16:02:32.921560 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921618 kubelet[2825]: W0213 16:02:32.921565 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921618 kubelet[2825]: E0213 16:02:32.921570 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.921692 kubelet[2825]: E0213 16:02:32.921670 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921692 kubelet[2825]: W0213 16:02:32.921675 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921692 kubelet[2825]: E0213 16:02:32.921680 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.921784 kubelet[2825]: E0213 16:02:32.921776 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921784 kubelet[2825]: W0213 16:02:32.921781 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921839 kubelet[2825]: E0213 16:02:32.921786 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.921892 kubelet[2825]: E0213 16:02:32.921882 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921892 kubelet[2825]: W0213 16:02:32.921889 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921955 kubelet[2825]: E0213 16:02:32.921894 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.921993 kubelet[2825]: E0213 16:02:32.921979 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.921993 kubelet[2825]: W0213 16:02:32.921983 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.921993 kubelet[2825]: E0213 16:02:32.921988 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.922091 kubelet[2825]: E0213 16:02:32.922082 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922091 kubelet[2825]: W0213 16:02:32.922088 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922136 kubelet[2825]: E0213 16:02:32.922094 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.922198 kubelet[2825]: E0213 16:02:32.922189 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922198 kubelet[2825]: W0213 16:02:32.922196 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922244 kubelet[2825]: E0213 16:02:32.922201 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.922289 kubelet[2825]: E0213 16:02:32.922281 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922289 kubelet[2825]: W0213 16:02:32.922288 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922331 kubelet[2825]: E0213 16:02:32.922292 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.922392 kubelet[2825]: E0213 16:02:32.922383 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922392 kubelet[2825]: W0213 16:02:32.922389 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922441 kubelet[2825]: E0213 16:02:32.922394 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.922492 kubelet[2825]: E0213 16:02:32.922484 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922492 kubelet[2825]: W0213 16:02:32.922490 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922534 kubelet[2825]: E0213 16:02:32.922495 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.922581 kubelet[2825]: E0213 16:02:32.922571 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922581 kubelet[2825]: W0213 16:02:32.922578 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922629 kubelet[2825]: E0213 16:02:32.922597 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.922685 kubelet[2825]: E0213 16:02:32.922677 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922685 kubelet[2825]: W0213 16:02:32.922684 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922726 kubelet[2825]: E0213 16:02:32.922689 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.922773 kubelet[2825]: E0213 16:02:32.922763 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922773 kubelet[2825]: W0213 16:02:32.922771 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922813 kubelet[2825]: E0213 16:02:32.922775 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.922883 kubelet[2825]: E0213 16:02:32.922875 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.922883 kubelet[2825]: W0213 16:02:32.922881 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.922939 kubelet[2825]: E0213 16:02:32.922908 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.941300 kubelet[2825]: E0213 16:02:32.941284 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.941464 kubelet[2825]: W0213 16:02:32.941391 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.941464 kubelet[2825]: E0213 16:02:32.941405 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.941670 kubelet[2825]: E0213 16:02:32.941559 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.941670 kubelet[2825]: W0213 16:02:32.941564 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.941670 kubelet[2825]: E0213 16:02:32.941569 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.942046 kubelet[2825]: E0213 16:02:32.941781 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942046 kubelet[2825]: W0213 16:02:32.941787 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942046 kubelet[2825]: E0213 16:02:32.941792 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.942046 kubelet[2825]: E0213 16:02:32.941943 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942046 kubelet[2825]: W0213 16:02:32.941948 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942046 kubelet[2825]: E0213 16:02:32.941954 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.942370 kubelet[2825]: E0213 16:02:32.942213 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942370 kubelet[2825]: W0213 16:02:32.942219 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942370 kubelet[2825]: E0213 16:02:32.942228 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.942370 kubelet[2825]: E0213 16:02:32.942338 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942370 kubelet[2825]: W0213 16:02:32.942346 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942370 kubelet[2825]: E0213 16:02:32.942356 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.942503 kubelet[2825]: E0213 16:02:32.942445 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942503 kubelet[2825]: W0213 16:02:32.942450 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942503 kubelet[2825]: E0213 16:02:32.942462 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.942568 kubelet[2825]: E0213 16:02:32.942556 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.942568 kubelet[2825]: W0213 16:02:32.942565 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.942624 kubelet[2825]: E0213 16:02:32.942573 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.949473 kubelet[2825]: E0213 16:02:32.949461 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.949473 kubelet[2825]: W0213 16:02:32.949470 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.949524 kubelet[2825]: E0213 16:02:32.949478 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.949763 kubelet[2825]: E0213 16:02:32.949700 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.949763 kubelet[2825]: W0213 16:02:32.949707 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.949763 kubelet[2825]: E0213 16:02:32.949717 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.949917 kubelet[2825]: E0213 16:02:32.949824 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.949917 kubelet[2825]: W0213 16:02:32.949830 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.949917 kubelet[2825]: E0213 16:02:32.949843 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.950005 kubelet[2825]: E0213 16:02:32.949999 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950039 kubelet[2825]: W0213 16:02:32.950034 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.950089 kubelet[2825]: E0213 16:02:32.950078 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.950250 kubelet[2825]: E0213 16:02:32.950190 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950250 kubelet[2825]: W0213 16:02:32.950196 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.950250 kubelet[2825]: E0213 16:02:32.950204 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.950344 kubelet[2825]: E0213 16:02:32.950338 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950376 kubelet[2825]: W0213 16:02:32.950371 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.950413 kubelet[2825]: E0213 16:02:32.950407 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.950543 kubelet[2825]: E0213 16:02:32.950536 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950664 kubelet[2825]: W0213 16:02:32.950575 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.950664 kubelet[2825]: E0213 16:02:32.950587 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.950756 kubelet[2825]: E0213 16:02:32.950743 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950756 kubelet[2825]: W0213 16:02:32.950752 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.950797 kubelet[2825]: E0213 16:02:32.950761 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:32.950930 kubelet[2825]: E0213 16:02:32.950924 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.950972 kubelet[2825]: W0213 16:02:32.950965 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.951021 kubelet[2825]: E0213 16:02:32.951008 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:32.951105 kubelet[2825]: E0213 16:02:32.951093 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:32.951105 kubelet[2825]: W0213 16:02:32.951103 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:32.951147 kubelet[2825]: E0213 16:02:32.951109 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.849036 kubelet[2825]: I0213 16:02:33.849014 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:02:33.927127 containerd[1568]: time="2025-02-13T16:02:33.926818181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:33.927560 containerd[1568]: time="2025-02-13T16:02:33.927539970Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Feb 13 16:02:33.929015 kubelet[2825]: E0213 16:02:33.928999 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.929015 kubelet[2825]: W0213 16:02:33.929013 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.929090 kubelet[2825]: E0213 16:02:33.929026 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.929595 containerd[1568]: time="2025-02-13T16:02:33.928116829Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:33.929632 kubelet[2825]: E0213 16:02:33.929620 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.929632 kubelet[2825]: W0213 16:02:33.929626 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.929670 kubelet[2825]: E0213 16:02:33.929632 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.929731 kubelet[2825]: E0213 16:02:33.929724 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.929731 kubelet[2825]: W0213 16:02:33.929730 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.929782 kubelet[2825]: E0213 16:02:33.929735 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.929828 kubelet[2825]: E0213 16:02:33.929821 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.929853 kubelet[2825]: W0213 16:02:33.929828 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.929853 kubelet[2825]: E0213 16:02:33.929833 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.930003 kubelet[2825]: E0213 16:02:33.929969 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930003 kubelet[2825]: W0213 16:02:33.929975 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930003 kubelet[2825]: E0213 16:02:33.929981 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.930374 kubelet[2825]: E0213 16:02:33.930167 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930374 kubelet[2825]: W0213 16:02:33.930173 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930374 kubelet[2825]: E0213 16:02:33.930179 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.930374 kubelet[2825]: E0213 16:02:33.930374 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930469 kubelet[2825]: W0213 16:02:33.930379 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930469 kubelet[2825]: E0213 16:02:33.930384 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.930513 kubelet[2825]: E0213 16:02:33.930479 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930513 kubelet[2825]: W0213 16:02:33.930484 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930513 kubelet[2825]: E0213 16:02:33.930489 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930600 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930879 kubelet[2825]: W0213 16:02:33.930605 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930612 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930715 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930879 kubelet[2825]: W0213 16:02:33.930728 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930746 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930843 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.930879 kubelet[2825]: W0213 16:02:33.930848 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.930879 kubelet[2825]: E0213 16:02:33.930853 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.931155 kubelet[2825]: E0213 16:02:33.930952 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.931155 kubelet[2825]: W0213 16:02:33.930956 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.931155 kubelet[2825]: E0213 16:02:33.930961 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.931155 kubelet[2825]: E0213 16:02:33.931066 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.931155 kubelet[2825]: W0213 16:02:33.931071 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.931155 kubelet[2825]: E0213 16:02:33.931076 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.931367 kubelet[2825]: E0213 16:02:33.931161 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.931367 kubelet[2825]: W0213 16:02:33.931166 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.931367 kubelet[2825]: E0213 16:02:33.931170 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.931367 kubelet[2825]: E0213 16:02:33.931266 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.931367 kubelet[2825]: W0213 16:02:33.931271 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.931367 kubelet[2825]: E0213 16:02:33.931275 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.932948 containerd[1568]: time="2025-02-13T16:02:33.932155157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:33.932948 containerd[1568]: time="2025-02-13T16:02:33.932912668Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.563322282s" Feb 13 16:02:33.933038 containerd[1568]: time="2025-02-13T16:02:33.933027540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 16:02:33.943539 containerd[1568]: time="2025-02-13T16:02:33.943509454Z" level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 16:02:33.947821 kubelet[2825]: E0213 16:02:33.947799 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.947821 kubelet[2825]: W0213 16:02:33.947817 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.948019 kubelet[2825]: E0213 16:02:33.947854 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.948133 kubelet[2825]: E0213 16:02:33.948039 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.948133 kubelet[2825]: W0213 16:02:33.948046 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.948133 kubelet[2825]: E0213 16:02:33.948054 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.948236 kubelet[2825]: E0213 16:02:33.948171 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.948236 kubelet[2825]: W0213 16:02:33.948175 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.948236 kubelet[2825]: E0213 16:02:33.948181 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948298 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949118 kubelet[2825]: W0213 16:02:33.948316 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948327 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948549 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949118 kubelet[2825]: W0213 16:02:33.948555 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948561 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948672 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949118 kubelet[2825]: W0213 16:02:33.948685 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948690 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.949118 kubelet[2825]: E0213 16:02:33.948789 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949347 kubelet[2825]: W0213 16:02:33.948793 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949347 kubelet[2825]: E0213 16:02:33.948798 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.949347 kubelet[2825]: E0213 16:02:33.948995 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949347 kubelet[2825]: W0213 16:02:33.949000 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949347 kubelet[2825]: E0213 16:02:33.949005 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.949347 kubelet[2825]: E0213 16:02:33.949191 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.949347 kubelet[2825]: W0213 16:02:33.949196 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.949347 kubelet[2825]: E0213 16:02:33.949205 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.951932 kubelet[2825]: E0213 16:02:33.950529 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.951932 kubelet[2825]: W0213 16:02:33.950536 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.952135 kubelet[2825]: E0213 16:02:33.952072 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.952204 kubelet[2825]: E0213 16:02:33.952193 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.952204 kubelet[2825]: W0213 16:02:33.952202 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.952314 kubelet[2825]: E0213 16:02:33.952212 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.952314 kubelet[2825]: E0213 16:02:33.952312 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.952350 kubelet[2825]: W0213 16:02:33.952317 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.952350 kubelet[2825]: E0213 16:02:33.952322 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.952599 kubelet[2825]: E0213 16:02:33.952590 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.952599 kubelet[2825]: W0213 16:02:33.952597 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.952693 kubelet[2825]: E0213 16:02:33.952605 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.953159 kubelet[2825]: E0213 16:02:33.953090 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.953159 kubelet[2825]: W0213 16:02:33.953098 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.953159 kubelet[2825]: E0213 16:02:33.953108 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.953464 kubelet[2825]: E0213 16:02:33.953387 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.953464 kubelet[2825]: W0213 16:02:33.953395 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.953464 kubelet[2825]: E0213 16:02:33.953403 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.953661 containerd[1568]: time="2025-02-13T16:02:33.953643256Z" level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337\"" Feb 13 16:02:33.953794 kubelet[2825]: E0213 16:02:33.953716 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.953794 kubelet[2825]: W0213 16:02:33.953723 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.953794 kubelet[2825]: E0213 16:02:33.953736 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.954848 kubelet[2825]: E0213 16:02:33.954788 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.954848 kubelet[2825]: W0213 16:02:33.954798 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.954848 kubelet[2825]: E0213 16:02:33.954805 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:02:33.955693 kubelet[2825]: E0213 16:02:33.955651 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:02:33.955693 kubelet[2825]: W0213 16:02:33.955659 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:02:33.955693 kubelet[2825]: E0213 16:02:33.955667 2825 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:02:33.955796 containerd[1568]: time="2025-02-13T16:02:33.955663915Z" level=info msg="StartContainer for \"f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337\"" Feb 13 16:02:33.981268 systemd[1]: run-containerd-runc-k8s.io-f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337-runc.2hJDwM.mount: Deactivated successfully. Feb 13 16:02:33.987012 systemd[1]: Started cri-containerd-f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337.scope - libcontainer container f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337. Feb 13 16:02:34.005162 containerd[1568]: time="2025-02-13T16:02:34.005136594Z" level=info msg="StartContainer for \"f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337\" returns successfully" Feb 13 16:02:34.013591 systemd[1]: cri-containerd-f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337.scope: Deactivated successfully. 
Feb 13 16:02:34.457213 containerd[1568]: time="2025-02-13T16:02:34.437581620Z" level=info msg="shim disconnected" id=f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337 namespace=k8s.io Feb 13 16:02:34.457213 containerd[1568]: time="2025-02-13T16:02:34.457138487Z" level=warning msg="cleaning up after shim disconnected" id=f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337 namespace=k8s.io Feb 13 16:02:34.457213 containerd[1568]: time="2025-02-13T16:02:34.457150017Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:02:34.464482 containerd[1568]: time="2025-02-13T16:02:34.464451019Z" level=warning msg="cleanup warnings time=\"2025-02-13T16:02:34Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 16:02:34.590783 kubelet[2825]: E0213 16:02:34.590460 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:34.852973 containerd[1568]: time="2025-02-13T16:02:34.852894622Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 16:02:34.949581 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f3ce5bca31e592125465e2a8c605cf4213e65d660e5ee464d8daf79e5537c337-rootfs.mount: Deactivated successfully. 
Feb 13 16:02:36.590971 kubelet[2825]: E0213 16:02:36.590943 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:38.591169 kubelet[2825]: E0213 16:02:38.590841 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:38.823246 containerd[1568]: time="2025-02-13T16:02:38.823195176Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:38.823737 containerd[1568]: time="2025-02-13T16:02:38.823673350Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 16:02:38.831761 containerd[1568]: time="2025-02-13T16:02:38.831718125Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:38.842093 containerd[1568]: time="2025-02-13T16:02:38.842025067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:38.844835 containerd[1568]: time="2025-02-13T16:02:38.844811672Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo 
digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 3.989183394s" Feb 13 16:02:38.844882 containerd[1568]: time="2025-02-13T16:02:38.844836490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 16:02:38.848215 containerd[1568]: time="2025-02-13T16:02:38.848166716Z" level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 16:02:38.920728 containerd[1568]: time="2025-02-13T16:02:38.920679260Z" level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889\"" Feb 13 16:02:38.921076 containerd[1568]: time="2025-02-13T16:02:38.921003452Z" level=info msg="StartContainer for \"ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889\"" Feb 13 16:02:38.958563 systemd[1]: run-containerd-runc-k8s.io-ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889-runc.E9h8uY.mount: Deactivated successfully. Feb 13 16:02:38.963994 systemd[1]: Started cri-containerd-ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889.scope - libcontainer container ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889. Feb 13 16:02:38.983214 containerd[1568]: time="2025-02-13T16:02:38.983143205Z" level=info msg="StartContainer for \"ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889\" returns successfully" Feb 13 16:02:39.961361 systemd[1]: cri-containerd-ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889.scope: Deactivated successfully. 
Feb 13 16:02:39.961550 systemd[1]: cri-containerd-ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889.scope: Consumed 276ms CPU time, 145.7M memory peak, 12K read from disk, 151M written to disk. Feb 13 16:02:39.979743 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889-rootfs.mount: Deactivated successfully. Feb 13 16:02:39.987777 containerd[1568]: time="2025-02-13T16:02:39.987739300Z" level=info msg="shim disconnected" id=ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889 namespace=k8s.io Feb 13 16:02:39.987777 containerd[1568]: time="2025-02-13T16:02:39.987775910Z" level=warning msg="cleaning up after shim disconnected" id=ae039b524e5e5394c235fdcd5b9d104e32cd4b2789cda36d44bc08f8d6db1889 namespace=k8s.io Feb 13 16:02:39.987998 containerd[1568]: time="2025-02-13T16:02:39.987781909Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:02:39.996850 containerd[1568]: time="2025-02-13T16:02:39.996816565Z" level=warning msg="cleanup warnings time=\"2025-02-13T16:02:39Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 16:02:40.006621 kubelet[2825]: I0213 16:02:40.006606 2825 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Feb 13 16:02:40.029371 systemd[1]: Created slice kubepods-burstable-pod661d4ddb_a267_4645_9fad_e5fa882fa0db.slice - libcontainer container kubepods-burstable-pod661d4ddb_a267_4645_9fad_e5fa882fa0db.slice. 
Feb 13 16:02:40.035570 kubelet[2825]: W0213 16:02:40.035548 2825 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'localhost' and this object Feb 13 16:02:40.035652 kubelet[2825]: E0213 16:02:40.035576 2825 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:localhost\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Feb 13 16:02:40.038204 systemd[1]: Created slice kubepods-burstable-pod68c42bcf_2d0d_494d_9abe_69e38a01bdd9.slice - libcontainer container kubepods-burstable-pod68c42bcf_2d0d_494d_9abe_69e38a01bdd9.slice. Feb 13 16:02:40.044287 systemd[1]: Created slice kubepods-besteffort-pod994d859a_3838_4bb4_a531_0bff4b1bbeaf.slice - libcontainer container kubepods-besteffort-pod994d859a_3838_4bb4_a531_0bff4b1bbeaf.slice. Feb 13 16:02:40.050907 systemd[1]: Created slice kubepods-besteffort-podc05aa49b_8336_4728_802a_0e8f6d326479.slice - libcontainer container kubepods-besteffort-podc05aa49b_8336_4728_802a_0e8f6d326479.slice. Feb 13 16:02:40.055557 systemd[1]: Created slice kubepods-besteffort-podba8e0efe_98a0_4c27_9797_c702cafd7556.slice - libcontainer container kubepods-besteffort-podba8e0efe_98a0_4c27_9797_c702cafd7556.slice. 
Feb 13 16:02:40.088012 kubelet[2825]: I0213 16:02:40.087810 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/68c42bcf-2d0d-494d-9abe-69e38a01bdd9-config-volume\") pod \"coredns-6f6b679f8f-dt72t\" (UID: \"68c42bcf-2d0d-494d-9abe-69e38a01bdd9\") " pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.088012 kubelet[2825]: I0213 16:02:40.087840 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr7bl\" (UniqueName: \"kubernetes.io/projected/68c42bcf-2d0d-494d-9abe-69e38a01bdd9-kube-api-access-lr7bl\") pod \"coredns-6f6b679f8f-dt72t\" (UID: \"68c42bcf-2d0d-494d-9abe-69e38a01bdd9\") " pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.088012 kubelet[2825]: I0213 16:02:40.087853 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05aa49b-8336-4728-802a-0e8f6d326479-tigera-ca-bundle\") pod \"calico-kube-controllers-cd99f8587-nvllq\" (UID: \"c05aa49b-8336-4728-802a-0e8f6d326479\") " pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:40.088012 kubelet[2825]: I0213 16:02:40.087863 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx25q\" (UniqueName: \"kubernetes.io/projected/994d859a-3838-4bb4-a531-0bff4b1bbeaf-kube-api-access-lx25q\") pod \"calico-apiserver-64877df8f5-rqk9c\" (UID: \"994d859a-3838-4bb4-a531-0bff4b1bbeaf\") " pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:40.088012 kubelet[2825]: I0213 16:02:40.087873 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/994d859a-3838-4bb4-a531-0bff4b1bbeaf-calico-apiserver-certs\") pod 
\"calico-apiserver-64877df8f5-rqk9c\" (UID: \"994d859a-3838-4bb4-a531-0bff4b1bbeaf\") " pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:40.088577 kubelet[2825]: I0213 16:02:40.087882 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/661d4ddb-a267-4645-9fad-e5fa882fa0db-config-volume\") pod \"coredns-6f6b679f8f-d2wxt\" (UID: \"661d4ddb-a267-4645-9fad-e5fa882fa0db\") " pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:40.088577 kubelet[2825]: I0213 16:02:40.087890 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b46n\" (UniqueName: \"kubernetes.io/projected/661d4ddb-a267-4645-9fad-e5fa882fa0db-kube-api-access-9b46n\") pod \"coredns-6f6b679f8f-d2wxt\" (UID: \"661d4ddb-a267-4645-9fad-e5fa882fa0db\") " pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:40.088577 kubelet[2825]: I0213 16:02:40.087910 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ba8e0efe-98a0-4c27-9797-c702cafd7556-calico-apiserver-certs\") pod \"calico-apiserver-64877df8f5-fjq62\" (UID: \"ba8e0efe-98a0-4c27-9797-c702cafd7556\") " pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:40.088577 kubelet[2825]: I0213 16:02:40.087922 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76mpg\" (UniqueName: \"kubernetes.io/projected/c05aa49b-8336-4728-802a-0e8f6d326479-kube-api-access-76mpg\") pod \"calico-kube-controllers-cd99f8587-nvllq\" (UID: \"c05aa49b-8336-4728-802a-0e8f6d326479\") " pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:40.088577 kubelet[2825]: I0213 16:02:40.087958 2825 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-zj4wr\" (UniqueName: \"kubernetes.io/projected/ba8e0efe-98a0-4c27-9797-c702cafd7556-kube-api-access-zj4wr\") pod \"calico-apiserver-64877df8f5-fjq62\" (UID: \"ba8e0efe-98a0-4c27-9797-c702cafd7556\") " pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:40.221735 kubelet[2825]: I0213 16:02:40.220818 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:02:40.335651 containerd[1568]: time="2025-02-13T16:02:40.335617788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:0,}" Feb 13 16:02:40.342241 containerd[1568]: time="2025-02-13T16:02:40.342219656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:0,}" Feb 13 16:02:40.354570 containerd[1568]: time="2025-02-13T16:02:40.354419566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:0,}" Feb 13 16:02:40.522976 containerd[1568]: time="2025-02-13T16:02:40.522670214Z" level=error msg="Failed to destroy network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.523378 containerd[1568]: time="2025-02-13T16:02:40.523181512Z" level=error msg="Failed to destroy network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.525459 containerd[1568]: 
time="2025-02-13T16:02:40.525443536Z" level=error msg="encountered an error cleaning up failed sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.525548 containerd[1568]: time="2025-02-13T16:02:40.525535671Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.525740 containerd[1568]: time="2025-02-13T16:02:40.525446471Z" level=error msg="encountered an error cleaning up failed sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.525799 containerd[1568]: time="2025-02-13T16:02:40.525787707Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.528135 containerd[1568]: 
time="2025-02-13T16:02:40.528090461Z" level=error msg="Failed to destroy network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.529017 containerd[1568]: time="2025-02-13T16:02:40.528947476Z" level=error msg="encountered an error cleaning up failed sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.529017 containerd[1568]: time="2025-02-13T16:02:40.528989522Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.529823 kubelet[2825]: E0213 16:02:40.529640 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.529823 kubelet[2825]: E0213 16:02:40.529686 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:40.529823 kubelet[2825]: E0213 16:02:40.529734 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:40.529899 kubelet[2825]: E0213 16:02:40.529769 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:40.529899 kubelet[2825]: E0213 16:02:40.529798 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 13 16:02:40.529899 kubelet[2825]: E0213 16:02:40.529811 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:40.530039 kubelet[2825]: E0213 16:02:40.529820 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:40.530039 kubelet[2825]: E0213 16:02:40.529832 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:40.530186 kubelet[2825]: E0213 16:02:40.530106 2825 log.go:32] "RunPodSandbox from runtime 
service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.530186 kubelet[2825]: E0213 16:02:40.530133 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.530186 kubelet[2825]: E0213 16:02:40.530145 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.530842 kubelet[2825]: E0213 16:02:40.530163 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:40.595536 systemd[1]: Created slice kubepods-besteffort-podfb1149e0_8e00_49ff_a8bd_416370ecd365.slice - libcontainer container kubepods-besteffort-podfb1149e0_8e00_49ff_a8bd_416370ecd365.slice. Feb 13 16:02:40.597124 containerd[1568]: time="2025-02-13T16:02:40.597075975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:0,}" Feb 13 16:02:40.636643 containerd[1568]: time="2025-02-13T16:02:40.636600477Z" level=error msg="Failed to destroy network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.641187 containerd[1568]: time="2025-02-13T16:02:40.636814238Z" level=error msg="encountered an error cleaning up failed sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.641187 containerd[1568]: time="2025-02-13T16:02:40.636865124Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.641250 kubelet[2825]: E0213 16:02:40.636999 
2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.641250 kubelet[2825]: E0213 16:02:40.637031 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:40.641250 kubelet[2825]: E0213 16:02:40.637047 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:40.641313 kubelet[2825]: E0213 16:02:40.637092 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:40.865231 containerd[1568]: time="2025-02-13T16:02:40.865204304Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 16:02:40.866787 kubelet[2825]: I0213 16:02:40.866032 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010" Feb 13 16:02:40.867399 containerd[1568]: time="2025-02-13T16:02:40.867299300Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:40.871554 kubelet[2825]: I0213 16:02:40.869943 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd" Feb 13 16:02:40.871645 containerd[1568]: time="2025-02-13T16:02:40.870288245Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:40.876376 containerd[1568]: time="2025-02-13T16:02:40.875857801Z" level=info msg="Ensure that sandbox 5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd in task-service has been cleanup successfully" Feb 13 16:02:40.876376 containerd[1568]: time="2025-02-13T16:02:40.876028054Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:40.876376 containerd[1568]: time="2025-02-13T16:02:40.876038495Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:40.877025 containerd[1568]: time="2025-02-13T16:02:40.877008790Z" level=info msg="Ensure that sandbox e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010 in task-service has been cleanup successfully" Feb 13 16:02:40.877240 
containerd[1568]: time="2025-02-13T16:02:40.877225742Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:40.877293 containerd[1568]: time="2025-02-13T16:02:40.877283442Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:40.877740 containerd[1568]: time="2025-02-13T16:02:40.877521254Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:1,}" Feb 13 16:02:40.878212 containerd[1568]: time="2025-02-13T16:02:40.878192692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:1,}" Feb 13 16:02:40.878263 kubelet[2825]: I0213 16:02:40.878254 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866" Feb 13 16:02:40.884858 containerd[1568]: time="2025-02-13T16:02:40.884829683Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:40.885067 containerd[1568]: time="2025-02-13T16:02:40.884948475Z" level=info msg="Ensure that sandbox 0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866 in task-service has been cleanup successfully" Feb 13 16:02:40.885752 containerd[1568]: time="2025-02-13T16:02:40.885191983Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:40.885752 containerd[1568]: time="2025-02-13T16:02:40.885203679Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:40.893630 containerd[1568]: time="2025-02-13T16:02:40.893587570Z" level=info 
msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:1,}" Feb 13 16:02:40.895549 kubelet[2825]: I0213 16:02:40.895037 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2" Feb 13 16:02:40.896440 containerd[1568]: time="2025-02-13T16:02:40.896419679Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:40.896591 containerd[1568]: time="2025-02-13T16:02:40.896576479Z" level=info msg="Ensure that sandbox 8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2 in task-service has been cleanup successfully" Feb 13 16:02:40.900730 containerd[1568]: time="2025-02-13T16:02:40.900710732Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 13 16:02:40.900825 containerd[1568]: time="2025-02-13T16:02:40.900816286Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:40.901886 containerd[1568]: time="2025-02-13T16:02:40.901873782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:1,}" Feb 13 16:02:40.947695 containerd[1568]: time="2025-02-13T16:02:40.947455183Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:0,}" Feb 13 16:02:40.953774 containerd[1568]: time="2025-02-13T16:02:40.953748705Z" level=error msg="Failed to destroy network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.954567 containerd[1568]: time="2025-02-13T16:02:40.954546845Z" level=error msg="encountered an error cleaning up failed sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.954648 containerd[1568]: time="2025-02-13T16:02:40.954635595Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.955113 kubelet[2825]: E0213 16:02:40.954835 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.955113 kubelet[2825]: E0213 16:02:40.954871 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.955113 kubelet[2825]: E0213 16:02:40.954883 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:40.955291 kubelet[2825]: E0213 16:02:40.954936 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:40.961343 containerd[1568]: time="2025-02-13T16:02:40.961310909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:0,}" Feb 13 16:02:40.975933 containerd[1568]: time="2025-02-13T16:02:40.975868353Z" level=error msg="Failed to destroy network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.976406 
containerd[1568]: time="2025-02-13T16:02:40.976207235Z" level=error msg="encountered an error cleaning up failed sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.976406 containerd[1568]: time="2025-02-13T16:02:40.976242578Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.976772 kubelet[2825]: E0213 16:02:40.976545 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:40.976772 kubelet[2825]: E0213 16:02:40.976580 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:40.976772 kubelet[2825]: E0213 16:02:40.976594 2825 kuberuntime_manager.go:1168] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:40.977294 kubelet[2825]: E0213 16:02:40.976618 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:40.988271 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010-shm.mount: Deactivated successfully. Feb 13 16:02:40.988342 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2-shm.mount: Deactivated successfully. 
Feb 13 16:02:41.000856 containerd[1568]: time="2025-02-13T16:02:41.000514108Z" level=error msg="Failed to destroy network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.001965 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a-shm.mount: Deactivated successfully. Feb 13 16:02:41.002622 containerd[1568]: time="2025-02-13T16:02:41.002557680Z" level=error msg="encountered an error cleaning up failed sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.002660 containerd[1568]: time="2025-02-13T16:02:41.002650710Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.002825 kubelet[2825]: E0213 16:02:41.002807 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 13 16:02:41.003002 kubelet[2825]: E0213 16:02:41.002899 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:41.003002 kubelet[2825]: E0213 16:02:41.002954 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:41.003448 kubelet[2825]: E0213 16:02:41.002991 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:41.007446 containerd[1568]: time="2025-02-13T16:02:41.007415694Z" level=error msg="Failed to 
destroy network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.007831 containerd[1568]: time="2025-02-13T16:02:41.007733362Z" level=error msg="encountered an error cleaning up failed sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.007831 containerd[1568]: time="2025-02-13T16:02:41.007774573Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.008133 kubelet[2825]: E0213 16:02:41.007976 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.008133 kubelet[2825]: E0213 16:02:41.008094 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:41.008133 kubelet[2825]: E0213 16:02:41.008112 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:41.008841 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5-shm.mount: Deactivated successfully. Feb 13 16:02:41.009091 kubelet[2825]: E0213 16:02:41.008876 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:41.036182 containerd[1568]: time="2025-02-13T16:02:41.036150098Z" level=error msg="Failed to destroy network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.037925 containerd[1568]: time="2025-02-13T16:02:41.037023247Z" level=error msg="encountered an error cleaning up failed sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.037925 containerd[1568]: time="2025-02-13T16:02:41.037061745Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.038016 kubelet[2825]: E0213 16:02:41.037665 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.038016 kubelet[2825]: E0213 16:02:41.037698 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:41.038016 kubelet[2825]: E0213 16:02:41.037710 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:41.038086 kubelet[2825]: E0213 16:02:41.037735 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podUID="994d859a-3838-4bb4-a531-0bff4b1bbeaf" Feb 13 16:02:41.038118 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5-shm.mount: Deactivated successfully. Feb 13 16:02:41.042212 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68-shm.mount: Deactivated successfully. 
Feb 13 16:02:41.042516 kubelet[2825]: E0213 16:02:41.041199 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.042516 kubelet[2825]: E0213 16:02:41.041226 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:41.042516 kubelet[2825]: E0213 16:02:41.041260 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:41.042591 containerd[1568]: time="2025-02-13T16:02:41.040822013Z" level=error msg="Failed to destroy network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.042591 containerd[1568]: time="2025-02-13T16:02:41.041041598Z" level=error msg="encountered an error cleaning up failed sandbox 
\"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.042591 containerd[1568]: time="2025-02-13T16:02:41.041072183Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:41.042670 kubelet[2825]: E0213 16:02:41.041346 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 16:02:41.911461 kubelet[2825]: I0213 16:02:41.911394 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7" Feb 13 16:02:41.912246 containerd[1568]: time="2025-02-13T16:02:41.911989526Z" 
level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:41.912246 containerd[1568]: time="2025-02-13T16:02:41.912139107Z" level=info msg="Ensure that sandbox a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7 in task-service has been cleanup successfully" Feb 13 16:02:41.912502 containerd[1568]: time="2025-02-13T16:02:41.912454488Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:41.912502 containerd[1568]: time="2025-02-13T16:02:41.912468282Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:41.912928 containerd[1568]: time="2025-02-13T16:02:41.912880775Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:41.912971 containerd[1568]: time="2025-02-13T16:02:41.912954941Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:41.912971 containerd[1568]: time="2025-02-13T16:02:41.912965871Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:41.913991 containerd[1568]: time="2025-02-13T16:02:41.913970146Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:2,}" Feb 13 16:02:41.914744 kubelet[2825]: I0213 16:02:41.914533 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a" Feb 13 16:02:41.915289 containerd[1568]: time="2025-02-13T16:02:41.915258247Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:02:41.915487 
containerd[1568]: time="2025-02-13T16:02:41.915473580Z" level=info msg="Ensure that sandbox 932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a in task-service has been cleanup successfully" Feb 13 16:02:41.915661 containerd[1568]: time="2025-02-13T16:02:41.915649609Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:02:41.915827 containerd[1568]: time="2025-02-13T16:02:41.915707332Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:02:41.916252 containerd[1568]: time="2025-02-13T16:02:41.916078163Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:41.916252 containerd[1568]: time="2025-02-13T16:02:41.916130602Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:41.916252 containerd[1568]: time="2025-02-13T16:02:41.916138555Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:41.916721 containerd[1568]: time="2025-02-13T16:02:41.916478937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:2,}" Feb 13 16:02:41.917996 kubelet[2825]: I0213 16:02:41.917979 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5" Feb 13 16:02:41.918653 containerd[1568]: time="2025-02-13T16:02:41.918527571Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" Feb 13 16:02:41.918700 containerd[1568]: time="2025-02-13T16:02:41.918675848Z" level=info msg="Ensure that sandbox 
198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5 in task-service has been cleanup successfully" Feb 13 16:02:41.919163 containerd[1568]: time="2025-02-13T16:02:41.919131828Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully" Feb 13 16:02:41.919163 containerd[1568]: time="2025-02-13T16:02:41.919143744Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully" Feb 13 16:02:41.919724 containerd[1568]: time="2025-02-13T16:02:41.919540093Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:41.919724 containerd[1568]: time="2025-02-13T16:02:41.919587457Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 13 16:02:41.919724 containerd[1568]: time="2025-02-13T16:02:41.919595207Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:41.922453 containerd[1568]: time="2025-02-13T16:02:41.922432580Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:2,}" Feb 13 16:02:41.933876 kubelet[2825]: I0213 16:02:41.933697 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5" Feb 13 16:02:41.935799 containerd[1568]: time="2025-02-13T16:02:41.935769965Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:02:41.936000 containerd[1568]: time="2025-02-13T16:02:41.935986413Z" level=info msg="Ensure that sandbox a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5 in task-service has been cleanup successfully" Feb 13 
16:02:41.936169 containerd[1568]: time="2025-02-13T16:02:41.936155344Z" level=info msg="TearDown network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:02:41.936169 containerd[1568]: time="2025-02-13T16:02:41.936166289Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:02:41.937541 containerd[1568]: time="2025-02-13T16:02:41.937497627Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:1,}" Feb 13 16:02:41.940650 kubelet[2825]: I0213 16:02:41.940419 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1" Feb 13 16:02:41.946783 kubelet[2825]: I0213 16:02:41.946762 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68" Feb 13 16:02:41.948247 containerd[1568]: time="2025-02-13T16:02:41.948042761Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:41.948247 containerd[1568]: time="2025-02-13T16:02:41.948170051Z" level=info msg="Ensure that sandbox 4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68 in task-service has been cleanup successfully" Feb 13 16:02:41.949529 containerd[1568]: time="2025-02-13T16:02:41.949516393Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" Feb 13 16:02:41.950437 containerd[1568]: time="2025-02-13T16:02:41.950214057Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:02:41.950437 containerd[1568]: time="2025-02-13T16:02:41.950226363Z" level=info msg="StopPodSandbox 
for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:41.950659 containerd[1568]: time="2025-02-13T16:02:41.950568922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:1,}" Feb 13 16:02:41.950659 containerd[1568]: time="2025-02-13T16:02:41.950578580Z" level=info msg="Ensure that sandbox a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1 in task-service has been cleanup successfully" Feb 13 16:02:41.951396 containerd[1568]: time="2025-02-13T16:02:41.951385312Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully" Feb 13 16:02:41.951531 containerd[1568]: time="2025-02-13T16:02:41.951487473Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully" Feb 13 16:02:41.952051 containerd[1568]: time="2025-02-13T16:02:41.952021581Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:41.952224 containerd[1568]: time="2025-02-13T16:02:41.952162591Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:41.952224 containerd[1568]: time="2025-02-13T16:02:41.952175507Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:41.952655 containerd[1568]: time="2025-02-13T16:02:41.952551429Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:2,}" Feb 13 16:02:41.985365 systemd[1]: run-netns-cni\x2d3e0da1d1\x2d4ab7\x2d73c3\x2d1530\x2dfca9ee6590f8.mount: Deactivated successfully. 
Feb 13 16:02:41.985426 systemd[1]: run-netns-cni\x2df73741ed\x2d9b3d\x2d773d\x2d81ed\x2dd1ab70f94193.mount: Deactivated successfully. Feb 13 16:02:41.985463 systemd[1]: run-netns-cni\x2d15763f3f\x2ddc70\x2d53d1\x2dceba\x2de8cb3af2f0ec.mount: Deactivated successfully. Feb 13 16:02:41.985498 systemd[1]: run-netns-cni\x2d7f148acb\x2db792\x2d5b84\x2da260\x2d0871735a42e5.mount: Deactivated successfully. Feb 13 16:02:41.985533 systemd[1]: run-netns-cni\x2dc686b063\x2d0ba1\x2de478\x2d4257\x2d11a149c3ebda.mount: Deactivated successfully. Feb 13 16:02:41.985565 systemd[1]: run-netns-cni\x2d1eb42e4e\x2d768f\x2dca6b\x2d5867\x2dee4e96e3cd95.mount: Deactivated successfully. Feb 13 16:02:42.040478 containerd[1568]: time="2025-02-13T16:02:42.040443048Z" level=error msg="Failed to destroy network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.042019 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4-shm.mount: Deactivated successfully. 
Feb 13 16:02:42.044606 containerd[1568]: time="2025-02-13T16:02:42.043227670Z" level=error msg="encountered an error cleaning up failed sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.044606 containerd[1568]: time="2025-02-13T16:02:42.043274199Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.044704 kubelet[2825]: E0213 16:02:42.043412 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.044704 kubelet[2825]: E0213 16:02:42.043451 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:42.044704 kubelet[2825]: E0213 16:02:42.043465 2825 
kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:42.044895 kubelet[2825]: E0213 16:02:42.043489 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:42.052485 containerd[1568]: time="2025-02-13T16:02:42.052449409Z" level=error msg="Failed to destroy network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.053779 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c-shm.mount: Deactivated successfully. 
Feb 13 16:02:42.054207 containerd[1568]: time="2025-02-13T16:02:42.054182868Z" level=error msg="encountered an error cleaning up failed sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.054284 containerd[1568]: time="2025-02-13T16:02:42.054224396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.054872 kubelet[2825]: E0213 16:02:42.054850 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.054953 kubelet[2825]: E0213 16:02:42.054886 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:42.055162 kubelet[2825]: E0213 
16:02:42.055147 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:42.055474 kubelet[2825]: E0213 16:02:42.055183 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podUID="994d859a-3838-4bb4-a531-0bff4b1bbeaf" Feb 13 16:02:42.068869 containerd[1568]: time="2025-02-13T16:02:42.068834601Z" level=error msg="Failed to destroy network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.069386 containerd[1568]: time="2025-02-13T16:02:42.069282400Z" level=error msg="encountered an error cleaning up failed sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.069478 containerd[1568]: time="2025-02-13T16:02:42.069465957Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.069918 kubelet[2825]: E0213 16:02:42.069745 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.069918 kubelet[2825]: E0213 16:02:42.069784 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:42.069918 kubelet[2825]: E0213 16:02:42.069797 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:42.070005 kubelet[2825]: E0213 16:02:42.069825 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:42.072029 containerd[1568]: time="2025-02-13T16:02:42.071894377Z" level=error msg="Failed to destroy network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.073089 containerd[1568]: time="2025-02-13T16:02:42.073069407Z" level=error msg="encountered an error cleaning up failed sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.073146 containerd[1568]: time="2025-02-13T16:02:42.073127315Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.073273 kubelet[2825]: E0213 16:02:42.073253 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.073308 kubelet[2825]: E0213 16:02:42.073284 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:42.073308 kubelet[2825]: E0213 16:02:42.073295 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:42.073556 kubelet[2825]: E0213 16:02:42.073380 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:42.083064 containerd[1568]: time="2025-02-13T16:02:42.082985162Z" level=error msg="Failed to destroy network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083211 containerd[1568]: time="2025-02-13T16:02:42.083038924Z" level=error msg="Failed to destroy network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083324 containerd[1568]: time="2025-02-13T16:02:42.083239067Z" level=error msg="encountered an error cleaning up failed sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083324 containerd[1568]: time="2025-02-13T16:02:42.083274668Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083457 kubelet[2825]: E0213 16:02:42.083411 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083457 kubelet[2825]: E0213 16:02:42.083448 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:42.083515 containerd[1568]: time="2025-02-13T16:02:42.083423163Z" level=error msg="encountered an error cleaning up failed sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.083540 kubelet[2825]: E0213 16:02:42.083461 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:42.083540 kubelet[2825]: E0213 16:02:42.083485 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:42.083658 containerd[1568]: time="2025-02-13T16:02:42.083614935Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.084122 kubelet[2825]: E0213 16:02:42.084022 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:42.084122 kubelet[2825]: E0213 16:02:42.084060 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:42.084122 kubelet[2825]: E0213 16:02:42.084072 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:42.084212 kubelet[2825]: E0213 16:02:42.084094 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 
16:02:42.950080 kubelet[2825]: I0213 16:02:42.949941 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c" Feb 13 16:02:42.951426 containerd[1568]: time="2025-02-13T16:02:42.951319179Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\"" Feb 13 16:02:42.952058 containerd[1568]: time="2025-02-13T16:02:42.951655698Z" level=info msg="Ensure that sandbox a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c in task-service has been cleanup successfully" Feb 13 16:02:42.952058 containerd[1568]: time="2025-02-13T16:02:42.951896041Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully" Feb 13 16:02:42.952058 containerd[1568]: time="2025-02-13T16:02:42.951917310Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully" Feb 13 16:02:42.952210 kubelet[2825]: I0213 16:02:42.951772 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2" Feb 13 16:02:42.952461 containerd[1568]: time="2025-02-13T16:02:42.952344922Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" Feb 13 16:02:42.952675 containerd[1568]: time="2025-02-13T16:02:42.952427143Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully" Feb 13 16:02:42.952801 containerd[1568]: time="2025-02-13T16:02:42.952731647Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully" Feb 13 16:02:42.952801 containerd[1568]: time="2025-02-13T16:02:42.952540214Z" level=info msg="StopPodSandbox for 
\"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:02:42.953049 containerd[1568]: time="2025-02-13T16:02:42.952914704Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:42.953049 containerd[1568]: time="2025-02-13T16:02:42.952970135Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:42.953049 containerd[1568]: time="2025-02-13T16:02:42.952979009Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:42.953222 containerd[1568]: time="2025-02-13T16:02:42.953145352Z" level=info msg="Ensure that sandbox f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2 in task-service has been cleanup successfully" Feb 13 16:02:42.953417 containerd[1568]: time="2025-02-13T16:02:42.953319445Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:02:42.953417 containerd[1568]: time="2025-02-13T16:02:42.953344539Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:02:42.953417 containerd[1568]: time="2025-02-13T16:02:42.953363305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:3,}" Feb 13 16:02:42.954114 containerd[1568]: time="2025-02-13T16:02:42.953822216Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:42.954152 kubelet[2825]: I0213 16:02:42.954109 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4" Feb 13 16:02:42.954421 containerd[1568]: 
time="2025-02-13T16:02:42.954401882Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:02:42.954421 containerd[1568]: time="2025-02-13T16:02:42.954418040Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:42.954579 containerd[1568]: time="2025-02-13T16:02:42.954503549Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:02:42.954789 containerd[1568]: time="2025-02-13T16:02:42.954720004Z" level=info msg="Ensure that sandbox 348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4 in task-service has been cleanup successfully" Feb 13 16:02:42.954964 containerd[1568]: time="2025-02-13T16:02:42.954860115Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:02:42.954964 containerd[1568]: time="2025-02-13T16:02:42.954873163Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:02:42.955204 containerd[1568]: time="2025-02-13T16:02:42.955192230Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:42.955268 containerd[1568]: time="2025-02-13T16:02:42.955239263Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:42.955268 containerd[1568]: time="2025-02-13T16:02:42.955252620Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:42.955331 containerd[1568]: time="2025-02-13T16:02:42.955318877Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:2,}" Feb 13 16:02:42.955775 containerd[1568]: time="2025-02-13T16:02:42.955755846Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:42.955813 containerd[1568]: time="2025-02-13T16:02:42.955807545Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:42.955843 containerd[1568]: time="2025-02-13T16:02:42.955816061Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:42.956878 containerd[1568]: time="2025-02-13T16:02:42.956468974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:3,}" Feb 13 16:02:42.957057 kubelet[2825]: I0213 16:02:42.956995 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62" Feb 13 16:02:42.957597 containerd[1568]: time="2025-02-13T16:02:42.957303025Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:02:42.957826 containerd[1568]: time="2025-02-13T16:02:42.957781957Z" level=info msg="Ensure that sandbox 2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62 in task-service has been cleanup successfully" Feb 13 16:02:42.958529 containerd[1568]: time="2025-02-13T16:02:42.958510990Z" level=info msg="TearDown network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:02:42.958529 containerd[1568]: time="2025-02-13T16:02:42.958525057Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns 
successfully" Feb 13 16:02:42.958975 containerd[1568]: time="2025-02-13T16:02:42.958899422Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:02:42.958975 containerd[1568]: time="2025-02-13T16:02:42.958967201Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:02:42.958975 containerd[1568]: time="2025-02-13T16:02:42.958975681Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:02:42.959714 kubelet[2825]: I0213 16:02:42.959398 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828" Feb 13 16:02:42.959769 containerd[1568]: time="2025-02-13T16:02:42.959416313Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:42.959769 containerd[1568]: time="2025-02-13T16:02:42.959470130Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:42.959769 containerd[1568]: time="2025-02-13T16:02:42.959481386Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:42.959769 containerd[1568]: time="2025-02-13T16:02:42.959666314Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\"" Feb 13 16:02:42.959869 containerd[1568]: time="2025-02-13T16:02:42.959770240Z" level=info msg="Ensure that sandbox 0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828 in task-service has been cleanup successfully" Feb 13 16:02:42.959957 containerd[1568]: time="2025-02-13T16:02:42.959928763Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:3,}" Feb 13 16:02:42.960122 containerd[1568]: time="2025-02-13T16:02:42.960107178Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully" Feb 13 16:02:42.960122 containerd[1568]: time="2025-02-13T16:02:42.960119475Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully" Feb 13 16:02:42.960525 containerd[1568]: time="2025-02-13T16:02:42.960505929Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" Feb 13 16:02:42.960579 containerd[1568]: time="2025-02-13T16:02:42.960557159Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully" Feb 13 16:02:42.960579 containerd[1568]: time="2025-02-13T16:02:42.960570676Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully" Feb 13 16:02:42.961007 containerd[1568]: time="2025-02-13T16:02:42.960825810Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:42.961007 containerd[1568]: time="2025-02-13T16:02:42.960877791Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 13 16:02:42.961007 containerd[1568]: time="2025-02-13T16:02:42.960885910Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:42.961189 kubelet[2825]: I0213 16:02:42.961142 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c" Feb 13 16:02:42.961575 
containerd[1568]: time="2025-02-13T16:02:42.961439774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:3,}" Feb 13 16:02:42.961714 containerd[1568]: time="2025-02-13T16:02:42.961693968Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:02:42.962882 containerd[1568]: time="2025-02-13T16:02:42.962859604Z" level=info msg="Ensure that sandbox 611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c in task-service has been cleanup successfully" Feb 13 16:02:42.963025 containerd[1568]: time="2025-02-13T16:02:42.963002696Z" level=info msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:02:42.963062 containerd[1568]: time="2025-02-13T16:02:42.963024065Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:02:42.963289 containerd[1568]: time="2025-02-13T16:02:42.963272162Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:02:42.963386 containerd[1568]: time="2025-02-13T16:02:42.963330230Z" level=info msg="TearDown network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:02:42.963386 containerd[1568]: time="2025-02-13T16:02:42.963345705Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:02:42.963747 containerd[1568]: time="2025-02-13T16:02:42.963728157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:2,}" Feb 13 16:02:42.980395 systemd[1]: run-netns-cni\x2d90a7a083\x2d44fe\x2df542\x2d73d4\x2df1681eeb864e.mount: 
Deactivated successfully. Feb 13 16:02:42.981966 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2-shm.mount: Deactivated successfully. Feb 13 16:02:42.982047 systemd[1]: run-netns-cni\x2d7ed61fa5\x2d0117\x2d9370\x2de821\x2da891fe52358d.mount: Deactivated successfully. Feb 13 16:02:42.982100 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c-shm.mount: Deactivated successfully. Feb 13 16:02:42.982184 systemd[1]: run-netns-cni\x2d6de4934c\x2d1758\x2d954a\x2da9b5\x2d70b8e4e6bf6b.mount: Deactivated successfully. Feb 13 16:02:42.982235 systemd[1]: run-netns-cni\x2d9d71e483\x2d5bf0\x2d77a0\x2d7051\x2d25bc71d7e1e4.mount: Deactivated successfully. Feb 13 16:02:42.982280 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828-shm.mount: Deactivated successfully. Feb 13 16:02:42.982330 systemd[1]: run-netns-cni\x2d5528aaf0\x2d1212\x2d474b\x2d680b\x2d0de7a1b5fb2a.mount: Deactivated successfully. Feb 13 16:02:42.982380 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62-shm.mount: Deactivated successfully. Feb 13 16:02:42.982429 systemd[1]: run-netns-cni\x2d7bf9f2a1\x2d8c25\x2d5e11\x2d81fd\x2d5fe97084a45c.mount: Deactivated successfully. 
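The `run-netns-cni\x2d…` unit names above are systemd's escaped form of mount paths: `/` becomes `-`, and a literal `-` in the path is encoded as `\x2d`. A minimal sketch of decoding one of these unit names back to its path (the specific unit name is taken from the log above; `systemd-escape --unescape --path` is the canonical tool for this):

```shell
# Decode a systemd-escaped mount unit name back to a path.
# Order matters: convert "-" to "/" first, then the "\x2d"
# escapes back to literal dashes.
unit='run-netns-cni\x2d90a7a083\x2d44fe\x2df542\x2d73d4\x2df1681eeb864e.mount'
printf '%s\n' "${unit%.mount}" | sed -e 's/-/\//g' -e 's/\\x2d/-/g'
```

This yields `run/netns/cni-90a7a083-44fe-f542-73d4-f1681eeb864e`, i.e. the per-sandbox CNI network namespace under `/run/netns` that systemd unmounted during cleanup.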
Feb 13 16:02:43.514017 containerd[1568]: time="2025-02-13T16:02:43.513947207Z" level=error msg="Failed to destroy network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.514452 containerd[1568]: time="2025-02-13T16:02:43.514292518Z" level=error msg="encountered an error cleaning up failed sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.514452 containerd[1568]: time="2025-02-13T16:02:43.514332565Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.515346 kubelet[2825]: E0213 16:02:43.514457 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.515346 kubelet[2825]: E0213 16:02:43.514493 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:43.515346 kubelet[2825]: E0213 16:02:43.514507 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:43.515788 kubelet[2825]: E0213 16:02:43.514533 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 16:02:43.545811 containerd[1568]: time="2025-02-13T16:02:43.545193517Z" level=error msg="Failed to destroy network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.546218 containerd[1568]: time="2025-02-13T16:02:43.546200612Z" level=error msg="encountered an error cleaning up failed sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.546271 containerd[1568]: time="2025-02-13T16:02:43.546238968Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.546414 kubelet[2825]: E0213 16:02:43.546379 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.546452 kubelet[2825]: E0213 16:02:43.546425 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 
16:02:43.546452 kubelet[2825]: E0213 16:02:43.546441 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:43.546672 kubelet[2825]: E0213 16:02:43.546653 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:43.547688 containerd[1568]: time="2025-02-13T16:02:43.547663088Z" level=error msg="Failed to destroy network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.547954 containerd[1568]: time="2025-02-13T16:02:43.547941044Z" level=error msg="encountered an error cleaning up failed sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548030 containerd[1568]: time="2025-02-13T16:02:43.548016160Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548292 containerd[1568]: time="2025-02-13T16:02:43.548279183Z" level=error msg="Failed to destroy network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548600 containerd[1568]: time="2025-02-13T16:02:43.548587136Z" level=error msg="encountered an error cleaning up failed sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548670 containerd[1568]: time="2025-02-13T16:02:43.548657562Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548720 kubelet[2825]: E0213 16:02:43.548693 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.548751 kubelet[2825]: E0213 16:02:43.548719 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:43.548779 kubelet[2825]: E0213 16:02:43.548733 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:43.549452 kubelet[2825]: E0213 16:02:43.548804 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podUID="994d859a-3838-4bb4-a531-0bff4b1bbeaf" Feb 13 16:02:43.549452 kubelet[2825]: E0213 16:02:43.549102 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.549452 kubelet[2825]: E0213 16:02:43.549124 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:43.549551 kubelet[2825]: E0213 16:02:43.549134 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:43.549551 kubelet[2825]: E0213 16:02:43.549158 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:43.552346 containerd[1568]: time="2025-02-13T16:02:43.552286321Z" level=error msg="Failed to destroy network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.552743 containerd[1568]: time="2025-02-13T16:02:43.552730190Z" level=error msg="encountered an error cleaning up failed sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.552935 containerd[1568]: time="2025-02-13T16:02:43.552921943Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 13 16:02:43.553235 containerd[1568]: time="2025-02-13T16:02:43.553204177Z" level=error msg="Failed to destroy network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.553696 kubelet[2825]: E0213 16:02:43.553518 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.553696 kubelet[2825]: E0213 16:02:43.553544 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:43.553696 kubelet[2825]: E0213 16:02:43.553563 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:43.553772 kubelet[2825]: E0213 16:02:43.553592 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:43.554436 containerd[1568]: time="2025-02-13T16:02:43.554364004Z" level=error msg="encountered an error cleaning up failed sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.554436 containerd[1568]: time="2025-02-13T16:02:43.554392585Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.554510 kubelet[2825]: E0213 16:02:43.554478 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:43.554510 kubelet[2825]: E0213 16:02:43.554501 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:43.554687 kubelet[2825]: E0213 16:02:43.554510 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:43.554687 kubelet[2825]: E0213 16:02:43.554556 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:43.963737 kubelet[2825]: 
I0213 16:02:43.963717 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8" Feb 13 16:02:43.964536 containerd[1568]: time="2025-02-13T16:02:43.964508237Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:02:43.964824 containerd[1568]: time="2025-02-13T16:02:43.964811262Z" level=info msg="Ensure that sandbox 85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8 in task-service has been cleanup successfully" Feb 13 16:02:43.965325 containerd[1568]: time="2025-02-13T16:02:43.965237755Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" successfully" Feb 13 16:02:43.965325 containerd[1568]: time="2025-02-13T16:02:43.965247967Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:02:43.965702 containerd[1568]: time="2025-02-13T16:02:43.965690937Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:02:43.965880 containerd[1568]: time="2025-02-13T16:02:43.965846580Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:02:43.965880 containerd[1568]: time="2025-02-13T16:02:43.965855340Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.967028034Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.967070820Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 
16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.967077138Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.967326451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:3,}" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968260924Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\"" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968360253Z" level=info msg="Ensure that sandbox ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9 in task-service has been cleanup successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968479894Z" level=info msg="TearDown network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968488024Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" returns successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968666308Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\"" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968711281Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968718322Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.968972139Z" level=info msg="StopPodSandbox for 
\"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.969107168Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.969116277Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.969534306Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.969630702Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:43.971534 containerd[1568]: time="2025-02-13T16:02:43.969638209Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:43.971875 kubelet[2825]: I0213 16:02:43.967983 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9" Feb 13 16:02:43.972522 kubelet[2825]: I0213 16:02:43.972487 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf" Feb 13 16:02:43.973277 containerd[1568]: time="2025-02-13T16:02:43.973106745Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:02:43.973930 containerd[1568]: time="2025-02-13T16:02:43.973721635Z" level=info msg="Ensure that sandbox 6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf in task-service has been cleanup successfully" Feb 13 16:02:43.974684 containerd[1568]: time="2025-02-13T16:02:43.974668087Z" 
level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 13 16:02:43.974684 containerd[1568]: time="2025-02-13T16:02:43.974682211Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:02:43.974869 containerd[1568]: time="2025-02-13T16:02:43.974833368Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:02:43.974894 containerd[1568]: time="2025-02-13T16:02:43.974876514Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:02:43.974894 containerd[1568]: time="2025-02-13T16:02:43.974885479Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:02:43.975474 containerd[1568]: time="2025-02-13T16:02:43.975377226Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:43.975474 containerd[1568]: time="2025-02-13T16:02:43.975419376Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:43.975474 containerd[1568]: time="2025-02-13T16:02:43.975449953Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:43.975808 containerd[1568]: time="2025-02-13T16:02:43.975794322Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:43.976145 containerd[1568]: time="2025-02-13T16:02:43.975837451Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:43.976145 containerd[1568]: time="2025-02-13T16:02:43.975847093Z" 
level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:43.976991 containerd[1568]: time="2025-02-13T16:02:43.976971981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:4,}" Feb 13 16:02:43.977573 kubelet[2825]: I0213 16:02:43.977227 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9" Feb 13 16:02:43.978240 containerd[1568]: time="2025-02-13T16:02:43.978138216Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:02:43.981694 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9-shm.mount: Deactivated successfully. Feb 13 16:02:43.981836 systemd[1]: run-netns-cni\x2db73fcfcc\x2dfd27\x2d1893\x2d958a\x2dc947c88a11a0.mount: Deactivated successfully. Feb 13 16:02:43.981875 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8-shm.mount: Deactivated successfully. 
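Every sandbox failure in the cluster of errors above shares one root cause: the calico CNI plugin cannot stat `/var/lib/calico/nodename`, a file the calico/node container writes once it is running with `/var/lib/calico` mounted, so the practical check is the one the error message itself names. As a minimal triage sketch (the regex and helper name are illustrative, not part of kubelet or any tool shown in this log), the affected pods and sandbox IDs can be pulled out of such lines:

```python
import re

# Matches the repeated kubelet error above: CNI setup failed because
# /var/lib/calico/nodename is missing (calico-node not running, or
# /var/lib/calico not mounted into it). Sandbox IDs are 64 hex chars,
# and the quotes around them appear backslash-escaped inside the
# err="..." value as captured in this journal.
ERR_RE = re.compile(
    r'failed to setup network for sandbox \\"(?P<sandbox>[0-9a-f]{64})\\"'
    r'.*?pod="(?P<pod>[^"]+)"'
)

def extract_failures(log_text):
    """Return (pod, sandbox_id) pairs for each CNI setup failure found."""
    return [(m.group("pod"), m.group("sandbox"))
            for m in ERR_RE.finditer(log_text)]
```

Once the failing pods are listed, the remediation the message suggests is to confirm the calico-node DaemonSet pod on this node is Running with `/var/lib/calico` mounted; kubelet retries the sandboxes on its own, which is why the `RunPodSandbox` entries above show the `Attempt:` counter climbing from 3 to 4.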
Feb 13 16:02:43.983263 containerd[1568]: time="2025-02-13T16:02:43.983247257Z" level=info msg="Ensure that sandbox 9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9 in task-service has been cleanup successfully" Feb 13 16:02:43.985060 containerd[1568]: time="2025-02-13T16:02:43.984109335Z" level=info msg="TearDown network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" successfully" Feb 13 16:02:43.985060 containerd[1568]: time="2025-02-13T16:02:43.984120176Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" returns successfully" Feb 13 16:02:43.986612 containerd[1568]: time="2025-02-13T16:02:43.986599390Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:02:43.986723 containerd[1568]: time="2025-02-13T16:02:43.986707243Z" level=info msg="TearDown network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:02:43.986777 containerd[1568]: time="2025-02-13T16:02:43.986769187Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns successfully" Feb 13 16:02:43.987228 systemd[1]: run-netns-cni\x2ddf53df4c\x2d21df\x2d7899\x2db667\x2d7ee4c6115a17.mount: Deactivated successfully. 
Feb 13 16:02:43.988428 containerd[1568]: time="2025-02-13T16:02:43.988413816Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:02:43.988528 containerd[1568]: time="2025-02-13T16:02:43.988519285Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:02:43.988593 containerd[1568]: time="2025-02-13T16:02:43.988580969Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:02:43.990771 containerd[1568]: time="2025-02-13T16:02:43.990711900Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:43.990992 containerd[1568]: time="2025-02-13T16:02:43.990961928Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:43.991049 containerd[1568]: time="2025-02-13T16:02:43.991039642Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:43.991957 kubelet[2825]: I0213 16:02:43.991941 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47" Feb 13 16:02:43.994834 containerd[1568]: time="2025-02-13T16:02:43.994813240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:4,}" Feb 13 16:02:43.995329 containerd[1568]: time="2025-02-13T16:02:43.995317612Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:02:43.995820 containerd[1568]: time="2025-02-13T16:02:43.995809319Z" level=info msg="Ensure that sandbox 
347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47 in task-service has been cleanup successfully" Feb 13 16:02:43.996182 containerd[1568]: time="2025-02-13T16:02:43.996171931Z" level=info msg="TearDown network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" successfully" Feb 13 16:02:44.002673 containerd[1568]: time="2025-02-13T16:02:43.996582338Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" returns successfully" Feb 13 16:02:43.998966 systemd[1]: run-netns-cni\x2d250f786f\x2d4014\x2d467a\x2dbadd\x2d3f053724d162.mount: Deactivated successfully. Feb 13 16:02:44.003162 containerd[1568]: time="2025-02-13T16:02:44.003146870Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:4,}" Feb 13 16:02:44.013544 containerd[1568]: time="2025-02-13T16:02:44.013524260Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\"" Feb 13 16:02:44.013670 containerd[1568]: time="2025-02-13T16:02:44.013640216Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully" Feb 13 16:02:44.013776 containerd[1568]: time="2025-02-13T16:02:44.013767655Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully" Feb 13 16:02:44.014739 containerd[1568]: time="2025-02-13T16:02:44.014729186Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" Feb 13 16:02:44.014821 containerd[1568]: time="2025-02-13T16:02:44.014813170Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully" Feb 13 16:02:44.014869 containerd[1568]: time="2025-02-13T16:02:44.014861172Z" level=info msg="StopPodSandbox for 
\"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully" Feb 13 16:02:44.015054 kubelet[2825]: I0213 16:02:44.015012 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6" Feb 13 16:02:44.015600 containerd[1568]: time="2025-02-13T16:02:44.015507904Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:02:44.016357 containerd[1568]: time="2025-02-13T16:02:44.016222433Z" level=info msg="Ensure that sandbox dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6 in task-service has been cleanup successfully" Feb 13 16:02:44.016790 containerd[1568]: time="2025-02-13T16:02:44.016773685Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.016827577Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.016837967Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.016873702Z" level=info msg="TearDown network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" successfully" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.016879545Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" returns successfully" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.017394879Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.017433026Z" level=info 
msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:02:44.017783 containerd[1568]: time="2025-02-13T16:02:44.017439139Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:02:44.018253 containerd[1568]: time="2025-02-13T16:02:44.018237458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:4,}" Feb 13 16:02:44.018954 containerd[1568]: time="2025-02-13T16:02:44.018937336Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:02:44.018996 containerd[1568]: time="2025-02-13T16:02:44.018983107Z" level=info msg="TearDown network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:02:44.019094 containerd[1568]: time="2025-02-13T16:02:44.019079996Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:02:44.020070 systemd[1]: run-netns-cni\x2df31c67aa\x2d6ee8\x2d2623\x2d2401\x2d9385d36ce807.mount: Deactivated successfully. 
Feb 13 16:02:44.020612 containerd[1568]: time="2025-02-13T16:02:44.020600345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:3,}" Feb 13 16:02:45.936385 containerd[1568]: time="2025-02-13T16:02:45.936310113Z" level=error msg="Failed to destroy network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:45.937290 containerd[1568]: time="2025-02-13T16:02:45.937157490Z" level=error msg="encountered an error cleaning up failed sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:45.937290 containerd[1568]: time="2025-02-13T16:02:45.937196909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:45.938048 kubelet[2825]: E0213 16:02:45.937450 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:45.938048 kubelet[2825]: E0213 16:02:45.937486 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:45.938048 kubelet[2825]: E0213 16:02:45.937501 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:45.938242 kubelet[2825]: E0213 16:02:45.937529 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 16:02:46.002771 containerd[1568]: 
time="2025-02-13T16:02:46.002624830Z" level=error msg="Failed to destroy network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.003059 containerd[1568]: time="2025-02-13T16:02:46.003045013Z" level=error msg="encountered an error cleaning up failed sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.003260 containerd[1568]: time="2025-02-13T16:02:46.003214639Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.003526 kubelet[2825]: E0213 16:02:46.003464 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.003526 kubelet[2825]: E0213 16:02:46.003508 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:46.003526 kubelet[2825]: E0213 16:02:46.003525 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:46.004183 kubelet[2825]: E0213 16:02:46.003554 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:46.091141 containerd[1568]: time="2025-02-13T16:02:46.090103252Z" level=error msg="Failed to destroy network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.091141 
containerd[1568]: time="2025-02-13T16:02:46.090326769Z" level=error msg="encountered an error cleaning up failed sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.091141 containerd[1568]: time="2025-02-13T16:02:46.090366871Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.091347 kubelet[2825]: E0213 16:02:46.090500 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.091347 kubelet[2825]: E0213 16:02:46.090539 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:46.091347 kubelet[2825]: E0213 16:02:46.090553 2825 kuberuntime_manager.go:1168] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:46.091414 kubelet[2825]: E0213 16:02:46.090576 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:46.098714 containerd[1568]: time="2025-02-13T16:02:46.097328940Z" level=error msg="Failed to destroy network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.098714 containerd[1568]: time="2025-02-13T16:02:46.097525478Z" level=error msg="encountered an error cleaning up failed sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Feb 13 16:02:46.098714 containerd[1568]: time="2025-02-13T16:02:46.097570360Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109458 kubelet[2825]: E0213 16:02:46.097708 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109458 kubelet[2825]: E0213 16:02:46.097744 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:46.109458 kubelet[2825]: E0213 16:02:46.097868 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" 
Feb 13 16:02:46.109563 containerd[1568]: time="2025-02-13T16:02:46.102854773Z" level=error msg="Failed to destroy network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109563 containerd[1568]: time="2025-02-13T16:02:46.103050116Z" level=error msg="encountered an error cleaning up failed sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109563 containerd[1568]: time="2025-02-13T16:02:46.103085062Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109678 kubelet[2825]: E0213 16:02:46.097895 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:46.109678 kubelet[2825]: E0213 16:02:46.103213 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.109678 kubelet[2825]: E0213 16:02:46.103246 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:46.109775 kubelet[2825]: E0213 16:02:46.103257 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:46.109775 kubelet[2825]: E0213 16:02:46.103280 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podUID="994d859a-3838-4bb4-a531-0bff4b1bbeaf" Feb 13 16:02:46.115814 containerd[1568]: time="2025-02-13T16:02:46.115784865Z" level=error msg="Failed to destroy network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.116326 containerd[1568]: time="2025-02-13T16:02:46.116228029Z" level=error msg="encountered an error cleaning up failed sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.116326 containerd[1568]: time="2025-02-13T16:02:46.116264796Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.116418 kubelet[2825]: E0213 16:02:46.116402 2825 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.116445 kubelet[2825]: E0213 16:02:46.116435 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:46.116466 kubelet[2825]: E0213 16:02:46.116448 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:46.116483 kubelet[2825]: E0213 16:02:46.116472 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:46.146620 kubelet[2825]: I0213 16:02:46.146150 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff" Feb 13 16:02:46.147443 containerd[1568]: time="2025-02-13T16:02:46.147422747Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:02:46.147684 containerd[1568]: time="2025-02-13T16:02:46.147672833Z" level=info msg="Ensure that sandbox db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff in task-service has been cleanup successfully" Feb 13 16:02:46.148046 containerd[1568]: time="2025-02-13T16:02:46.148030855Z" level=info msg="TearDown network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" successfully" Feb 13 16:02:46.148095 containerd[1568]: time="2025-02-13T16:02:46.148087452Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" returns successfully" Feb 13 16:02:46.149712 containerd[1568]: time="2025-02-13T16:02:46.149450527Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:02:46.149712 containerd[1568]: time="2025-02-13T16:02:46.149519027Z" level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 13 16:02:46.149712 containerd[1568]: time="2025-02-13T16:02:46.149548714Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:02:46.150629 containerd[1568]: time="2025-02-13T16:02:46.150610628Z" level=info msg="StopPodSandbox for 
\"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:02:46.150676 containerd[1568]: time="2025-02-13T16:02:46.150664927Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:02:46.150699 containerd[1568]: time="2025-02-13T16:02:46.150673930Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:02:46.150898 containerd[1568]: time="2025-02-13T16:02:46.150887033Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:46.151605 containerd[1568]: time="2025-02-13T16:02:46.151595128Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:46.151683 containerd[1568]: time="2025-02-13T16:02:46.151674263Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:46.153210 containerd[1568]: time="2025-02-13T16:02:46.152925732Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:46.153210 containerd[1568]: time="2025-02-13T16:02:46.152971730Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:46.153210 containerd[1568]: time="2025-02-13T16:02:46.152982774Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:46.153368 containerd[1568]: time="2025-02-13T16:02:46.153354121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:5,}" Feb 13 16:02:46.153944 kubelet[2825]: I0213 16:02:46.153508 2825 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88" Feb 13 16:02:46.154743 containerd[1568]: time="2025-02-13T16:02:46.154717176Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:02:46.154865 containerd[1568]: time="2025-02-13T16:02:46.154851620Z" level=info msg="Ensure that sandbox b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88 in task-service has been cleanup successfully" Feb 13 16:02:46.165285 containerd[1568]: time="2025-02-13T16:02:46.165262167Z" level=info msg="TearDown network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" successfully" Feb 13 16:02:46.165285 containerd[1568]: time="2025-02-13T16:02:46.165280574Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" returns successfully" Feb 13 16:02:46.166104 containerd[1568]: time="2025-02-13T16:02:46.166090959Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:02:46.166171 containerd[1568]: time="2025-02-13T16:02:46.166141416Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" successfully" Feb 13 16:02:46.166259 containerd[1568]: time="2025-02-13T16:02:46.166173997Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:02:46.166541 containerd[1568]: time="2025-02-13T16:02:46.166524014Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:02:46.166867 containerd[1568]: time="2025-02-13T16:02:46.166690759Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:02:46.166867 
containerd[1568]: time="2025-02-13T16:02:46.166698581Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:02:46.167275 containerd[1568]: time="2025-02-13T16:02:46.167253247Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:46.167305 containerd[1568]: time="2025-02-13T16:02:46.167294892Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:02:46.167305 containerd[1568]: time="2025-02-13T16:02:46.167300995Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:46.167605 containerd[1568]: time="2025-02-13T16:02:46.167554972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:4,}" Feb 13 16:02:46.237219 containerd[1568]: time="2025-02-13T16:02:46.237009769Z" level=error msg="Failed to destroy network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.238388 containerd[1568]: time="2025-02-13T16:02:46.237373740Z" level=error msg="encountered an error cleaning up failed sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.238388 containerd[1568]: time="2025-02-13T16:02:46.237410734Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.238436 kubelet[2825]: E0213 16:02:46.237732 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.238436 kubelet[2825]: E0213 16:02:46.237766 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:46.238436 kubelet[2825]: E0213 16:02:46.237779 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:46.238499 kubelet[2825]: E0213 16:02:46.238045 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:46.247966 containerd[1568]: time="2025-02-13T16:02:46.247935490Z" level=error msg="Failed to destroy network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.248448 containerd[1568]: time="2025-02-13T16:02:46.248430973Z" level=error msg="encountered an error cleaning up failed sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.248486 containerd[1568]: time="2025-02-13T16:02:46.248472648Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.248687 kubelet[2825]: E0213 16:02:46.248612 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:46.248687 kubelet[2825]: E0213 16:02:46.248653 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:46.248687 kubelet[2825]: E0213 16:02:46.248666 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:46.248897 kubelet[2825]: E0213 16:02:46.248790 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 16:02:46.817640 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015-shm.mount: Deactivated successfully. Feb 13 16:02:46.817697 systemd[1]: run-netns-cni\x2d120471a2\x2d40fe\x2d4de4\x2d9683\x2dcd45230eaf44.mount: Deactivated successfully. Feb 13 16:02:46.817735 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff-shm.mount: Deactivated successfully. Feb 13 16:02:46.817775 systemd[1]: run-netns-cni\x2d82fadb53\x2df1b3\x2d122b\x2d1d58\x2d8af82580c235.mount: Deactivated successfully. Feb 13 16:02:46.817809 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88-shm.mount: Deactivated successfully. Feb 13 16:02:46.917411 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4194366713.mount: Deactivated successfully. 
Feb 13 16:02:47.071324 containerd[1568]: time="2025-02-13T16:02:47.067294541Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:47.097091 containerd[1568]: time="2025-02-13T16:02:47.096996390Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 16:02:47.106600 containerd[1568]: time="2025-02-13T16:02:47.106573803Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:47.121598 containerd[1568]: time="2025-02-13T16:02:47.121570616Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:02:47.122858 containerd[1568]: time="2025-02-13T16:02:47.122840751Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 6.256592199s" Feb 13 16:02:47.122899 containerd[1568]: time="2025-02-13T16:02:47.122862406Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 16:02:47.156633 kubelet[2825]: I0213 16:02:47.156443 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e" Feb 13 16:02:47.157522 containerd[1568]: time="2025-02-13T16:02:47.157369454Z" level=info msg="StopPodSandbox for 
\"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" Feb 13 16:02:47.158552 containerd[1568]: time="2025-02-13T16:02:47.158165157Z" level=info msg="Ensure that sandbox 424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e in task-service has been cleanup successfully" Feb 13 16:02:47.159775 systemd[1]: run-netns-cni\x2d665315c4\x2d5ed1\x2db8e4\x2db343\x2dc3259d7438c6.mount: Deactivated successfully. Feb 13 16:02:47.160289 containerd[1568]: time="2025-02-13T16:02:47.159835131Z" level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 16:02:47.161355 containerd[1568]: time="2025-02-13T16:02:47.160991859Z" level=info msg="TearDown network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" successfully" Feb 13 16:02:47.161355 containerd[1568]: time="2025-02-13T16:02:47.161005726Z" level=info msg="StopPodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" returns successfully" Feb 13 16:02:47.161869 containerd[1568]: time="2025-02-13T16:02:47.161666446Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:02:47.161869 containerd[1568]: time="2025-02-13T16:02:47.161723920Z" level=info msg="TearDown network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" successfully" Feb 13 16:02:47.161869 containerd[1568]: time="2025-02-13T16:02:47.161731426Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" returns successfully" Feb 13 16:02:47.162129 containerd[1568]: time="2025-02-13T16:02:47.162114315Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:02:47.162260 containerd[1568]: time="2025-02-13T16:02:47.162197612Z" level=info msg="TearDown network for 
sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:02:47.162260 containerd[1568]: time="2025-02-13T16:02:47.162206201Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns successfully" Feb 13 16:02:47.162538 containerd[1568]: time="2025-02-13T16:02:47.162423271Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:02:47.162538 containerd[1568]: time="2025-02-13T16:02:47.162477041Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:02:47.162538 containerd[1568]: time="2025-02-13T16:02:47.162484312Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:02:47.162730 containerd[1568]: time="2025-02-13T16:02:47.162720717Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:47.162846 containerd[1568]: time="2025-02-13T16:02:47.162797728Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:47.162846 containerd[1568]: time="2025-02-13T16:02:47.162804999Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:47.163177 containerd[1568]: time="2025-02-13T16:02:47.163166668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:5,}" Feb 13 16:02:47.164519 kubelet[2825]: I0213 16:02:47.163857 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e" Feb 13 16:02:47.164788 containerd[1568]: 
time="2025-02-13T16:02:47.164775794Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" Feb 13 16:02:47.166357 containerd[1568]: time="2025-02-13T16:02:47.166329243Z" level=info msg="Ensure that sandbox 1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e in task-service has been cleanup successfully" Feb 13 16:02:47.169268 containerd[1568]: time="2025-02-13T16:02:47.169254821Z" level=info msg="TearDown network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" successfully" Feb 13 16:02:47.169414 containerd[1568]: time="2025-02-13T16:02:47.169328726Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" returns successfully" Feb 13 16:02:47.169630 containerd[1568]: time="2025-02-13T16:02:47.169570746Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:02:47.169708 containerd[1568]: time="2025-02-13T16:02:47.169617336Z" level=info msg="TearDown network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" successfully" Feb 13 16:02:47.169708 containerd[1568]: time="2025-02-13T16:02:47.169671883Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" returns successfully" Feb 13 16:02:47.170069 containerd[1568]: time="2025-02-13T16:02:47.170044208Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\"" Feb 13 16:02:47.170217 containerd[1568]: time="2025-02-13T16:02:47.170135910Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully" Feb 13 16:02:47.170217 containerd[1568]: time="2025-02-13T16:02:47.170144282Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully" Feb 13 16:02:47.170529 
containerd[1568]: time="2025-02-13T16:02:47.170379119Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" Feb 13 16:02:47.170529 containerd[1568]: time="2025-02-13T16:02:47.170416573Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully" Feb 13 16:02:47.170529 containerd[1568]: time="2025-02-13T16:02:47.170431772Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully" Feb 13 16:02:47.171572 containerd[1568]: time="2025-02-13T16:02:47.171562282Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:47.172882 containerd[1568]: time="2025-02-13T16:02:47.171969195Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 13 16:02:47.172882 containerd[1568]: time="2025-02-13T16:02:47.171978685Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:47.172882 containerd[1568]: time="2025-02-13T16:02:47.172189106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:5,}" Feb 13 16:02:47.173501 kubelet[2825]: I0213 16:02:47.173484 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9" Feb 13 16:02:47.175784 containerd[1568]: time="2025-02-13T16:02:47.175359950Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" Feb 13 16:02:47.175784 containerd[1568]: time="2025-02-13T16:02:47.175469619Z" level=info msg="Ensure that sandbox 91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9 in 
task-service has been cleanup successfully" Feb 13 16:02:47.178057 containerd[1568]: time="2025-02-13T16:02:47.176685826Z" level=info msg="TearDown network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" successfully" Feb 13 16:02:47.178057 containerd[1568]: time="2025-02-13T16:02:47.176700147Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" returns successfully" Feb 13 16:02:47.178708 containerd[1568]: time="2025-02-13T16:02:47.178687094Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:02:47.178739 containerd[1568]: time="2025-02-13T16:02:47.178732367Z" level=info msg="TearDown network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" successfully" Feb 13 16:02:47.178766 containerd[1568]: time="2025-02-13T16:02:47.178738719Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" returns successfully" Feb 13 16:02:47.179628 containerd[1568]: time="2025-02-13T16:02:47.179397757Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:02:47.179628 containerd[1568]: time="2025-02-13T16:02:47.179450225Z" level=info msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:02:47.179628 containerd[1568]: time="2025-02-13T16:02:47.179458875Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:02:47.180181 containerd[1568]: time="2025-02-13T16:02:47.180168062Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:02:47.180252 containerd[1568]: time="2025-02-13T16:02:47.180223656Z" level=info msg="TearDown network for sandbox 
\"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:02:47.180275 containerd[1568]: time="2025-02-13T16:02:47.180251413Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:02:47.180940 containerd[1568]: time="2025-02-13T16:02:47.180840268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:4,}" Feb 13 16:02:47.181330 kubelet[2825]: I0213 16:02:47.181118 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015" Feb 13 16:02:47.184236 containerd[1568]: time="2025-02-13T16:02:47.184215391Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\"" Feb 13 16:02:47.184863 containerd[1568]: time="2025-02-13T16:02:47.184838962Z" level=info msg="Ensure that sandbox 98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015 in task-service has been cleanup successfully" Feb 13 16:02:47.185442 containerd[1568]: time="2025-02-13T16:02:47.185135339Z" level=info msg="TearDown network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" successfully" Feb 13 16:02:47.185442 containerd[1568]: time="2025-02-13T16:02:47.185147289Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" returns successfully" Feb 13 16:02:47.185703 containerd[1568]: time="2025-02-13T16:02:47.185623366Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\"" Feb 13 16:02:47.185735 containerd[1568]: time="2025-02-13T16:02:47.185705303Z" level=info msg="TearDown network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" successfully" Feb 13 16:02:47.185735 
containerd[1568]: time="2025-02-13T16:02:47.185713355Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" returns successfully" Feb 13 16:02:47.186411 containerd[1568]: time="2025-02-13T16:02:47.186159207Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\"" Feb 13 16:02:47.186411 containerd[1568]: time="2025-02-13T16:02:47.186206957Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully" Feb 13 16:02:47.186411 containerd[1568]: time="2025-02-13T16:02:47.186214235Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully" Feb 13 16:02:47.187086 containerd[1568]: time="2025-02-13T16:02:47.186580236Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" Feb 13 16:02:47.187086 containerd[1568]: time="2025-02-13T16:02:47.186618964Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully" Feb 13 16:02:47.187086 containerd[1568]: time="2025-02-13T16:02:47.186625191Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully" Feb 13 16:02:47.187755 containerd[1568]: time="2025-02-13T16:02:47.187237325Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:47.187755 containerd[1568]: time="2025-02-13T16:02:47.187284828Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:47.187755 containerd[1568]: time="2025-02-13T16:02:47.187290825Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:47.188656 
containerd[1568]: time="2025-02-13T16:02:47.188605836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:5,}" Feb 13 16:02:47.189051 kubelet[2825]: I0213 16:02:47.189003 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4" Feb 13 16:02:47.190024 containerd[1568]: time="2025-02-13T16:02:47.189771779Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" Feb 13 16:02:47.190024 containerd[1568]: time="2025-02-13T16:02:47.189883022Z" level=info msg="Ensure that sandbox d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4 in task-service has been cleanup successfully" Feb 13 16:02:47.192547 containerd[1568]: time="2025-02-13T16:02:47.192529382Z" level=info msg="TearDown network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" successfully" Feb 13 16:02:47.192678 containerd[1568]: time="2025-02-13T16:02:47.192594657Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" returns successfully" Feb 13 16:02:47.193670 containerd[1568]: time="2025-02-13T16:02:47.193658135Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:02:47.193883 containerd[1568]: time="2025-02-13T16:02:47.193762526Z" level=info msg="TearDown network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" successfully" Feb 13 16:02:47.193883 containerd[1568]: time="2025-02-13T16:02:47.193772396Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" returns successfully" Feb 13 16:02:47.194243 containerd[1568]: time="2025-02-13T16:02:47.194182375Z" level=info msg="StopPodSandbox for 
\"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:02:47.194243 containerd[1568]: time="2025-02-13T16:02:47.194226681Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" successfully" Feb 13 16:02:47.194243 containerd[1568]: time="2025-02-13T16:02:47.194233579Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:02:47.196135 containerd[1568]: time="2025-02-13T16:02:47.195991226Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:02:47.196135 containerd[1568]: time="2025-02-13T16:02:47.196037750Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:02:47.196135 containerd[1568]: time="2025-02-13T16:02:47.196045180Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:02:47.197526 containerd[1568]: time="2025-02-13T16:02:47.197452909Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:47.197589 containerd[1568]: time="2025-02-13T16:02:47.197575662Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:02:47.197589 containerd[1568]: time="2025-02-13T16:02:47.197586018Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:47.206324 containerd[1568]: time="2025-02-13T16:02:47.206245712Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:5,}" Feb 13 16:02:47.207105 kubelet[2825]: I0213 16:02:47.207088 2825 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910" Feb 13 16:02:47.213123 containerd[1568]: time="2025-02-13T16:02:47.213057169Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" Feb 13 16:02:47.213369 containerd[1568]: time="2025-02-13T16:02:47.213345116Z" level=info msg="Ensure that sandbox 1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910 in task-service has been cleanup successfully" Feb 13 16:02:47.213554 containerd[1568]: time="2025-02-13T16:02:47.213535178Z" level=info msg="TearDown network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" successfully" Feb 13 16:02:47.213700 containerd[1568]: time="2025-02-13T16:02:47.213545241Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" returns successfully" Feb 13 16:02:47.215701 containerd[1568]: time="2025-02-13T16:02:47.215685474Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:02:47.216402 containerd[1568]: time="2025-02-13T16:02:47.216391433Z" level=info msg="TearDown network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" successfully" Feb 13 16:02:47.216467 containerd[1568]: time="2025-02-13T16:02:47.216451960Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" returns successfully" Feb 13 16:02:47.218467 containerd[1568]: time="2025-02-13T16:02:47.218450469Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:02:47.218570 containerd[1568]: time="2025-02-13T16:02:47.218561761Z" level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 13 16:02:47.218766 
containerd[1568]: time="2025-02-13T16:02:47.218613687Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:02:47.220412 containerd[1568]: time="2025-02-13T16:02:47.218894734Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:02:47.220412 containerd[1568]: time="2025-02-13T16:02:47.218956191Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:02:47.220412 containerd[1568]: time="2025-02-13T16:02:47.218963021Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:02:47.221691 containerd[1568]: time="2025-02-13T16:02:47.221434128Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:47.221691 containerd[1568]: time="2025-02-13T16:02:47.221481807Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:47.221691 containerd[1568]: time="2025-02-13T16:02:47.221506911Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:47.223532 containerd[1568]: time="2025-02-13T16:02:47.223519188Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:47.223651 containerd[1568]: time="2025-02-13T16:02:47.223610258Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:47.223651 containerd[1568]: time="2025-02-13T16:02:47.223618996Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:47.225486 
containerd[1568]: time="2025-02-13T16:02:47.225411015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:6,}" Feb 13 16:02:47.260795 containerd[1568]: time="2025-02-13T16:02:47.260753085Z" level=error msg="Failed to destroy network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.262224 containerd[1568]: time="2025-02-13T16:02:47.260979872Z" level=error msg="encountered an error cleaning up failed sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.262224 containerd[1568]: time="2025-02-13T16:02:47.261568924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.262316 kubelet[2825]: E0213 16:02:47.261713 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.262316 kubelet[2825]: E0213 16:02:47.261755 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:47.262316 kubelet[2825]: E0213 16:02:47.261768 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" Feb 13 16:02:47.262400 kubelet[2825]: E0213 16:02:47.261800 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-cd99f8587-nvllq_calico-system(c05aa49b-8336-4728-802a-0e8f6d326479)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podUID="c05aa49b-8336-4728-802a-0e8f6d326479" Feb 13 16:02:47.267650 containerd[1568]: time="2025-02-13T16:02:47.267621206Z" 
level=info msg="CreateContainer within sandbox \"7a79473507266677a47766d15a646f7c27c30f0601bb71e638649aaf20c733ff\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"838a240a5aaf84b1c19acbfc8da4e5e9832e8f2121b7d44484cb5c853454ff43\"" Feb 13 16:02:47.279985 containerd[1568]: time="2025-02-13T16:02:47.279962742Z" level=info msg="StartContainer for \"838a240a5aaf84b1c19acbfc8da4e5e9832e8f2121b7d44484cb5c853454ff43\"" Feb 13 16:02:47.339982 containerd[1568]: time="2025-02-13T16:02:47.339829998Z" level=error msg="Failed to destroy network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.340784 containerd[1568]: time="2025-02-13T16:02:47.340328046Z" level=error msg="encountered an error cleaning up failed sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.340784 containerd[1568]: time="2025-02-13T16:02:47.340370414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.340987 kubelet[2825]: E0213 16:02:47.340505 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.340987 kubelet[2825]: E0213 16:02:47.340537 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:47.340987 kubelet[2825]: E0213 16:02:47.340552 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-d2wxt" Feb 13 16:02:47.341544 kubelet[2825]: E0213 16:02:47.340590 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-d2wxt_kube-system(661d4ddb-a267-4645-9fad-e5fa882fa0db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-6f6b679f8f-d2wxt" podUID="661d4ddb-a267-4645-9fad-e5fa882fa0db" Feb 13 16:02:47.354346 containerd[1568]: time="2025-02-13T16:02:47.354317410Z" level=error msg="Failed to destroy network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.354652 containerd[1568]: time="2025-02-13T16:02:47.354635930Z" level=error msg="encountered an error cleaning up failed sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.365965 containerd[1568]: time="2025-02-13T16:02:47.365856166Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.366081 kubelet[2825]: E0213 16:02:47.366035 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.366081 kubelet[2825]: E0213 16:02:47.366072 2825 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:47.366138 kubelet[2825]: E0213 16:02:47.366084 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" Feb 13 16:02:47.366138 kubelet[2825]: E0213 16:02:47.366111 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-rqk9c_calico-apiserver(994d859a-3838-4bb4-a531-0bff4b1bbeaf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podUID="994d859a-3838-4bb4-a531-0bff4b1bbeaf" Feb 13 16:02:47.368694 containerd[1568]: time="2025-02-13T16:02:47.368477515Z" level=error msg="Failed to destroy network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\"" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.368791 containerd[1568]: time="2025-02-13T16:02:47.368777824Z" level=error msg="encountered an error cleaning up failed sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.368934 containerd[1568]: time="2025-02-13T16:02:47.368889060Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.369103 kubelet[2825]: E0213 16:02:47.369078 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.369162 kubelet[2825]: E0213 16:02:47.369120 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:47.369162 kubelet[2825]: E0213 16:02:47.369132 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-dt72t" Feb 13 16:02:47.369230 kubelet[2825]: E0213 16:02:47.369156 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-dt72t_kube-system(68c42bcf-2d0d-494d-9abe-69e38a01bdd9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-dt72t" podUID="68c42bcf-2d0d-494d-9abe-69e38a01bdd9" Feb 13 16:02:47.380455 containerd[1568]: time="2025-02-13T16:02:47.380394294Z" level=error msg="Failed to destroy network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.380626 containerd[1568]: time="2025-02-13T16:02:47.380604658Z" level=error msg="encountered an error cleaning up failed sandbox 
\"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.380662 containerd[1568]: time="2025-02-13T16:02:47.380645180Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.380792 containerd[1568]: time="2025-02-13T16:02:47.380756908Z" level=error msg="Failed to destroy network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.380879 kubelet[2825]: E0213 16:02:47.380807 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.380879 kubelet[2825]: E0213 16:02:47.380852 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:47.380879 kubelet[2825]: E0213 16:02:47.380866 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cv4n7" Feb 13 16:02:47.381529 containerd[1568]: time="2025-02-13T16:02:47.381057721Z" level=error msg="encountered an error cleaning up failed sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.381529 containerd[1568]: time="2025-02-13T16:02:47.381083577Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.381608 kubelet[2825]: E0213 16:02:47.380897 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"csi-node-driver-cv4n7_calico-system(fb1149e0-8e00-49ff-a8bd-416370ecd365)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cv4n7" podUID="fb1149e0-8e00-49ff-a8bd-416370ecd365" Feb 13 16:02:47.381608 kubelet[2825]: E0213 16:02:47.381411 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:02:47.381608 kubelet[2825]: E0213 16:02:47.381433 2825 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:47.381684 kubelet[2825]: E0213 16:02:47.381450 2825 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" Feb 13 16:02:47.381684 kubelet[2825]: E0213 
16:02:47.381467 2825 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-64877df8f5-fjq62_calico-apiserver(ba8e0efe-98a0-4c27-9797-c702cafd7556)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podUID="ba8e0efe-98a0-4c27-9797-c702cafd7556" Feb 13 16:02:47.432043 systemd[1]: Started cri-containerd-838a240a5aaf84b1c19acbfc8da4e5e9832e8f2121b7d44484cb5c853454ff43.scope - libcontainer container 838a240a5aaf84b1c19acbfc8da4e5e9832e8f2121b7d44484cb5c853454ff43. Feb 13 16:02:47.458216 containerd[1568]: time="2025-02-13T16:02:47.458185586Z" level=info msg="StartContainer for \"838a240a5aaf84b1c19acbfc8da4e5e9832e8f2121b7d44484cb5c853454ff43\" returns successfully" Feb 13 16:02:47.549008 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 16:02:47.549651 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Feb 13 16:02:47.822184 systemd[1]: run-netns-cni\x2d7e3170e3\x2d0a1e\x2d507c\x2dc32e\x2d6cbfeb3ecef4.mount: Deactivated successfully. Feb 13 16:02:47.822250 systemd[1]: run-netns-cni\x2d479fd139\x2dbacb\x2d4bb4\x2d199b\x2df2360fd01832.mount: Deactivated successfully. Feb 13 16:02:47.822289 systemd[1]: run-netns-cni\x2d556843fd\x2d4f73\x2de238\x2d2169\x2d522a117bfff3.mount: Deactivated successfully. Feb 13 16:02:47.822323 systemd[1]: run-netns-cni\x2d38d5837b\x2d27ec\x2dfaf2\x2db2f5\x2da8dc3ad97c8a.mount: Deactivated successfully. 
Feb 13 16:02:47.822368 systemd[1]: run-netns-cni\x2d7f772fbe\x2d86e1\x2d9cfa\x2df693\x2d44071b3ab53c.mount: Deactivated successfully. Feb 13 16:02:48.214599 kubelet[2825]: I0213 16:02:48.213331 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b" Feb 13 16:02:48.215310 containerd[1568]: time="2025-02-13T16:02:48.213718078Z" level=info msg="StopPodSandbox for \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\"" Feb 13 16:02:48.215310 containerd[1568]: time="2025-02-13T16:02:48.213847986Z" level=info msg="Ensure that sandbox e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b in task-service has been cleanup successfully" Feb 13 16:02:48.217412 containerd[1568]: time="2025-02-13T16:02:48.215347645Z" level=info msg="TearDown network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" successfully" Feb 13 16:02:48.217412 containerd[1568]: time="2025-02-13T16:02:48.215361120Z" level=info msg="StopPodSandbox for \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" returns successfully" Feb 13 16:02:48.216729 systemd[1]: run-netns-cni\x2d88ea3a80\x2d84b9\x2d1cde\x2da20b\x2d998ea9f4ccec.mount: Deactivated successfully. 
Feb 13 16:02:48.218463 containerd[1568]: time="2025-02-13T16:02:48.218430770Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\"" Feb 13 16:02:48.218517 containerd[1568]: time="2025-02-13T16:02:48.218502375Z" level=info msg="TearDown network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" successfully" Feb 13 16:02:48.218517 containerd[1568]: time="2025-02-13T16:02:48.218509836Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" returns successfully" Feb 13 16:02:48.220978 containerd[1568]: time="2025-02-13T16:02:48.220835117Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\"" Feb 13 16:02:48.220978 containerd[1568]: time="2025-02-13T16:02:48.220925239Z" level=info msg="TearDown network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" successfully" Feb 13 16:02:48.220978 containerd[1568]: time="2025-02-13T16:02:48.220934254Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" returns successfully" Feb 13 16:02:48.221299 containerd[1568]: time="2025-02-13T16:02:48.221284145Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\"" Feb 13 16:02:48.221344 containerd[1568]: time="2025-02-13T16:02:48.221332573Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully" Feb 13 16:02:48.221344 containerd[1568]: time="2025-02-13T16:02:48.221341047Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully" Feb 13 16:02:48.222509 containerd[1568]: time="2025-02-13T16:02:48.222294345Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\"" Feb 13 16:02:48.222586 
containerd[1568]: time="2025-02-13T16:02:48.222481478Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully" Feb 13 16:02:48.222586 containerd[1568]: time="2025-02-13T16:02:48.222580187Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully" Feb 13 16:02:48.243438 containerd[1568]: time="2025-02-13T16:02:48.243167282Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\"" Feb 13 16:02:48.243438 containerd[1568]: time="2025-02-13T16:02:48.243250331Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully" Feb 13 16:02:48.243438 containerd[1568]: time="2025-02-13T16:02:48.243259483Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully" Feb 13 16:02:48.243564 containerd[1568]: time="2025-02-13T16:02:48.243541346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:6,}" Feb 13 16:02:48.244589 kubelet[2825]: I0213 16:02:48.244571 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5" Feb 13 16:02:48.246785 containerd[1568]: time="2025-02-13T16:02:48.246345715Z" level=info msg="StopPodSandbox for \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\"" Feb 13 16:02:48.247705 containerd[1568]: time="2025-02-13T16:02:48.246574990Z" level=info msg="Ensure that sandbox bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5 in task-service has been cleanup successfully" Feb 13 16:02:48.247705 containerd[1568]: time="2025-02-13T16:02:48.247645334Z" level=info msg="TearDown network for sandbox 
\"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" successfully" Feb 13 16:02:48.247705 containerd[1568]: time="2025-02-13T16:02:48.247666672Z" level=info msg="StopPodSandbox for \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" returns successfully" Feb 13 16:02:48.249189 systemd[1]: run-netns-cni\x2d2280d8f5\x2dc2c4\x2d8d49\x2d6e70\x2dd6bf8d66b3da.mount: Deactivated successfully. Feb 13 16:02:48.250618 containerd[1568]: time="2025-02-13T16:02:48.250596464Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" Feb 13 16:02:48.250680 containerd[1568]: time="2025-02-13T16:02:48.250666883Z" level=info msg="TearDown network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" successfully" Feb 13 16:02:48.250680 containerd[1568]: time="2025-02-13T16:02:48.250678491Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" returns successfully" Feb 13 16:02:48.251916 containerd[1568]: time="2025-02-13T16:02:48.251634093Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:02:48.251916 containerd[1568]: time="2025-02-13T16:02:48.251687502Z" level=info msg="TearDown network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" successfully" Feb 13 16:02:48.251916 containerd[1568]: time="2025-02-13T16:02:48.251694029Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" returns successfully" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.252351360Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.252401696Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" 
successfully" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.252409365Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.253084298Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.253128113Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:02:48.253292 containerd[1568]: time="2025-02-13T16:02:48.253134636Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:02:48.253601 containerd[1568]: time="2025-02-13T16:02:48.253590908Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:02:48.253721 containerd[1568]: time="2025-02-13T16:02:48.253670923Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:02:48.253759 containerd[1568]: time="2025-02-13T16:02:48.253752489Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:02:48.254390 containerd[1568]: time="2025-02-13T16:02:48.254292985Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:6,}" Feb 13 16:02:48.254799 kubelet[2825]: I0213 16:02:48.243597 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gr9pp" podStartSLOduration=1.5068637950000001 podStartE2EDuration="19.222530676s" podCreationTimestamp="2025-02-13 16:02:29 +0000 UTC" firstStartedPulling="2025-02-13 
16:02:29.416184379 +0000 UTC m=+11.953538785" lastFinishedPulling="2025-02-13 16:02:47.13185126 +0000 UTC m=+29.669205666" observedRunningTime="2025-02-13 16:02:48.220995227 +0000 UTC m=+30.758349642" watchObservedRunningTime="2025-02-13 16:02:48.222530676 +0000 UTC m=+30.759885103" Feb 13 16:02:48.255568 kubelet[2825]: I0213 16:02:48.255547 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e" Feb 13 16:02:48.257580 containerd[1568]: time="2025-02-13T16:02:48.256036924Z" level=info msg="StopPodSandbox for \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\"" Feb 13 16:02:48.257580 containerd[1568]: time="2025-02-13T16:02:48.256155527Z" level=info msg="Ensure that sandbox ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e in task-service has been cleanup successfully" Feb 13 16:02:48.259098 systemd[1]: run-netns-cni\x2dcbc331b0\x2dad58\x2d2311\x2df336\x2d8dc167b8ad87.mount: Deactivated successfully. 
Feb 13 16:02:48.260891 containerd[1568]: time="2025-02-13T16:02:48.260402767Z" level=info msg="TearDown network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" successfully" Feb 13 16:02:48.260891 containerd[1568]: time="2025-02-13T16:02:48.260419129Z" level=info msg="StopPodSandbox for \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" returns successfully" Feb 13 16:02:48.261769 containerd[1568]: time="2025-02-13T16:02:48.261699367Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" Feb 13 16:02:48.261980 containerd[1568]: time="2025-02-13T16:02:48.261872094Z" level=info msg="TearDown network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" successfully" Feb 13 16:02:48.261980 containerd[1568]: time="2025-02-13T16:02:48.261883632Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" returns successfully" Feb 13 16:02:48.262848 containerd[1568]: time="2025-02-13T16:02:48.262460736Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:02:48.262848 containerd[1568]: time="2025-02-13T16:02:48.262504203Z" level=info msg="TearDown network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" successfully" Feb 13 16:02:48.262848 containerd[1568]: time="2025-02-13T16:02:48.262510256Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" returns successfully" Feb 13 16:02:48.263704 containerd[1568]: time="2025-02-13T16:02:48.263553592Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:02:48.263704 containerd[1568]: time="2025-02-13T16:02:48.263594585Z" level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 
13 16:02:48.263704 containerd[1568]: time="2025-02-13T16:02:48.263600756Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:02:48.264496 containerd[1568]: time="2025-02-13T16:02:48.264351591Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:02:48.264496 containerd[1568]: time="2025-02-13T16:02:48.264395052Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:02:48.264496 containerd[1568]: time="2025-02-13T16:02:48.264401367Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:02:48.264826 containerd[1568]: time="2025-02-13T16:02:48.264702247Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:02:48.264826 containerd[1568]: time="2025-02-13T16:02:48.264755849Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:02:48.264826 containerd[1568]: time="2025-02-13T16:02:48.264763502Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:02:48.266482 kubelet[2825]: I0213 16:02:48.265579 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084" Feb 13 16:02:48.266808 containerd[1568]: time="2025-02-13T16:02:48.266686840Z" level=info msg="StopPodSandbox for \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\"" Feb 13 16:02:48.266975 containerd[1568]: time="2025-02-13T16:02:48.266942346Z" level=info msg="Ensure that sandbox 041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084 in task-service has been 
cleanup successfully" Feb 13 16:02:48.272314 containerd[1568]: time="2025-02-13T16:02:48.272289753Z" level=info msg="TearDown network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" successfully" Feb 13 16:02:48.272314 containerd[1568]: time="2025-02-13T16:02:48.272308189Z" level=info msg="StopPodSandbox for \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" returns successfully" Feb 13 16:02:48.272411 containerd[1568]: time="2025-02-13T16:02:48.267115156Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:02:48.272411 containerd[1568]: time="2025-02-13T16:02:48.272391372Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:02:48.272411 containerd[1568]: time="2025-02-13T16:02:48.272397928Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:02:48.272931 containerd[1568]: time="2025-02-13T16:02:48.272677962Z" level=info msg="StopPodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" Feb 13 16:02:48.272931 containerd[1568]: time="2025-02-13T16:02:48.272721254Z" level=info msg="TearDown network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" successfully" Feb 13 16:02:48.272931 containerd[1568]: time="2025-02-13T16:02:48.272727284Z" level=info msg="StopPodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" returns successfully" Feb 13 16:02:48.272931 containerd[1568]: time="2025-02-13T16:02:48.272816669Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:7,}" Feb 13 16:02:48.275706 containerd[1568]: time="2025-02-13T16:02:48.275610295Z" level=info msg="StopPodSandbox for 
\"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:02:48.275706 containerd[1568]: time="2025-02-13T16:02:48.275694444Z" level=info msg="TearDown network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" successfully" Feb 13 16:02:48.275706 containerd[1568]: time="2025-02-13T16:02:48.275703403Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" returns successfully" Feb 13 16:02:48.277281 containerd[1568]: time="2025-02-13T16:02:48.277128181Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:02:48.277281 containerd[1568]: time="2025-02-13T16:02:48.277174510Z" level=info msg="TearDown network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:02:48.277281 containerd[1568]: time="2025-02-13T16:02:48.277181056Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns successfully" Feb 13 16:02:48.279106 containerd[1568]: time="2025-02-13T16:02:48.278717323Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:02:48.279106 containerd[1568]: time="2025-02-13T16:02:48.278766252Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:02:48.279106 containerd[1568]: time="2025-02-13T16:02:48.278798473Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:02:48.281280 containerd[1568]: time="2025-02-13T16:02:48.281244172Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:02:48.281478 containerd[1568]: time="2025-02-13T16:02:48.281315602Z" level=info msg="TearDown network for sandbox 
\"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:02:48.281794 containerd[1568]: time="2025-02-13T16:02:48.281327392Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:02:48.283179 containerd[1568]: time="2025-02-13T16:02:48.283030371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:6,}" Feb 13 16:02:48.290418 kubelet[2825]: I0213 16:02:48.290400 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd" Feb 13 16:02:48.292503 containerd[1568]: time="2025-02-13T16:02:48.292010952Z" level=info msg="StopPodSandbox for \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\"" Feb 13 16:02:48.292503 containerd[1568]: time="2025-02-13T16:02:48.292141178Z" level=info msg="Ensure that sandbox 3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd in task-service has been cleanup successfully" Feb 13 16:02:48.292503 containerd[1568]: time="2025-02-13T16:02:48.292275202Z" level=info msg="TearDown network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" successfully" Feb 13 16:02:48.292503 containerd[1568]: time="2025-02-13T16:02:48.292285966Z" level=info msg="StopPodSandbox for \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" returns successfully" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.292710570Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.292770411Z" level=info msg="TearDown network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" successfully" Feb 13 16:02:48.294198 
containerd[1568]: time="2025-02-13T16:02:48.292778146Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" returns successfully" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.293033950Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.293091148Z" level=info msg="TearDown network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" successfully" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.293098682Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" returns successfully" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.293504494Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\"" Feb 13 16:02:48.294198 containerd[1568]: time="2025-02-13T16:02:48.293543419Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully" Feb 13 16:02:48.294428 kubelet[2825]: I0213 16:02:48.294408 2825 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a" Feb 13 16:02:48.295486 containerd[1568]: time="2025-02-13T16:02:48.293550886Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully" Feb 13 16:02:48.295486 containerd[1568]: time="2025-02-13T16:02:48.294841074Z" level=info msg="StopPodSandbox for \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\"" Feb 13 16:02:48.298235 containerd[1568]: time="2025-02-13T16:02:48.298192934Z" level=info msg="Ensure that sandbox c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a in task-service has been cleanup successfully" 
Feb 13 16:02:48.298400 containerd[1568]: time="2025-02-13T16:02:48.298385996Z" level=info msg="TearDown network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" successfully" Feb 13 16:02:48.298400 containerd[1568]: time="2025-02-13T16:02:48.298398153Z" level=info msg="StopPodSandbox for \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" returns successfully" Feb 13 16:02:48.299362 containerd[1568]: time="2025-02-13T16:02:48.299346957Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\"" Feb 13 16:02:48.299425 containerd[1568]: time="2025-02-13T16:02:48.299410009Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully" Feb 13 16:02:48.299425 containerd[1568]: time="2025-02-13T16:02:48.299419566Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully" Feb 13 16:02:48.299579 containerd[1568]: time="2025-02-13T16:02:48.299564561Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\"" Feb 13 16:02:48.299620 containerd[1568]: time="2025-02-13T16:02:48.299608983Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" Feb 13 16:02:48.299650 containerd[1568]: time="2025-02-13T16:02:48.299644530Z" level=info msg="TearDown network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" successfully" Feb 13 16:02:48.299671 containerd[1568]: time="2025-02-13T16:02:48.299650483Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" returns successfully" Feb 13 16:02:48.299792 containerd[1568]: time="2025-02-13T16:02:48.299778200Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully" Feb 
13 16:02:48.299792 containerd[1568]: time="2025-02-13T16:02:48.299787227Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300172692Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300211319Z" level=info msg="TearDown network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" successfully" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300217149Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" returns successfully" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300336726Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300393767Z" level=info msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300402273Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:02:48.300542 containerd[1568]: time="2025-02-13T16:02:48.300460564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:6,}" Feb 13 16:02:48.301137 containerd[1568]: time="2025-02-13T16:02:48.300816575Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:02:48.301137 containerd[1568]: time="2025-02-13T16:02:48.300861405Z" level=info msg="TearDown network for sandbox 
\"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:02:48.301137 containerd[1568]: time="2025-02-13T16:02:48.300867403Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:02:48.301977 containerd[1568]: time="2025-02-13T16:02:48.301830290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:5,}" Feb 13 16:02:48.762028 systemd-networkd[1271]: cali3ffea42fa42: Link UP Feb 13 16:02:48.762134 systemd-networkd[1271]: cali3ffea42fa42: Gained carrier Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.415 [INFO][4838] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.428 [INFO][4838] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0 calico-apiserver-64877df8f5- calico-apiserver 994d859a-3838-4bb4-a531-0bff4b1bbeaf 667 0 2025-02-13 16:02:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64877df8f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64877df8f5-rqk9c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3ffea42fa42 [] []}} ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.428 [INFO][4838] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4863] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" HandleID="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Workload="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4863] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" HandleID="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Workload="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b47e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64877df8f5-rqk9c", "timestamp":"2025-02-13 16:02:48.716325207 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4863] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.735 [INFO][4863] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.735 [INFO][4863] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.736 [INFO][4863] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.742 [INFO][4863] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.744 [INFO][4863] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.745 [INFO][4863] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.747 [INFO][4863] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.747 [INFO][4863] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.747 [INFO][4863] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87 Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.749 [INFO][4863] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.752 [INFO][4863] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.752 [INFO][4863] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" host="localhost" Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.753 [INFO][4863] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:02:48.773035 containerd[1568]: 2025-02-13 16:02:48.753 [INFO][4863] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" HandleID="k8s-pod-network.1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Workload="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.754 [INFO][4838] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0", GenerateName:"calico-apiserver-64877df8f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"994d859a-3838-4bb4-a531-0bff4b1bbeaf", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64877df8f5", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64877df8f5-rqk9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3ffea42fa42", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.754 [INFO][4838] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.754 [INFO][4838] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3ffea42fa42 ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.761 [INFO][4838] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.762 [INFO][4838] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0", GenerateName:"calico-apiserver-64877df8f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"994d859a-3838-4bb4-a531-0bff4b1bbeaf", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64877df8f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87", Pod:"calico-apiserver-64877df8f5-rqk9c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3ffea42fa42", MAC:"c6:6e:04:34:4d:ca", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.773813 containerd[1568]: 2025-02-13 16:02:48.771 [INFO][4838] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-rqk9c" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--rqk9c-eth0" Feb 13 16:02:48.805669 containerd[1568]: time="2025-02-13T16:02:48.805456716Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:48.805669 containerd[1568]: time="2025-02-13T16:02:48.805494711Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:48.805669 containerd[1568]: time="2025-02-13T16:02:48.805503305Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:48.805669 containerd[1568]: time="2025-02-13T16:02:48.805551189Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:48.817988 systemd[1]: Started cri-containerd-1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87.scope - libcontainer container 1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87. Feb 13 16:02:48.824718 systemd[1]: run-netns-cni\x2ddd86a960\x2d6bfa\x2de02f\x2da55a\x2df04be1a33cb4.mount: Deactivated successfully. Feb 13 16:02:48.824779 systemd[1]: run-netns-cni\x2d310325e1\x2d96af\x2d2c27\x2d77f7\x2d549fddb1c323.mount: Deactivated successfully. Feb 13 16:02:48.824833 systemd[1]: run-netns-cni\x2d8668668b\x2daeb7\x2d6e63\x2dfb39\x2dba893cd89f52.mount: Deactivated successfully. 
Feb 13 16:02:48.829184 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:02:48.854468 containerd[1568]: time="2025-02-13T16:02:48.854300332Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-rqk9c,Uid:994d859a-3838-4bb4-a531-0bff4b1bbeaf,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87\"" Feb 13 16:02:48.858652 containerd[1568]: time="2025-02-13T16:02:48.858506402Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 16:02:48.865978 systemd-networkd[1271]: calib93bb6d70fc: Link UP Feb 13 16:02:48.866409 systemd-networkd[1271]: calib93bb6d70fc: Gained carrier Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.328 [INFO][4781] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.388 [INFO][4781] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0 calico-apiserver-64877df8f5- calico-apiserver ba8e0efe-98a0-4c27-9797-c702cafd7556 665 0 2025-02-13 16:02:28 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:64877df8f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-64877df8f5-fjq62 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib93bb6d70fc [] []}} ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.388 
[INFO][4781] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4855] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" HandleID="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Workload="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4855] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" HandleID="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Workload="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002787c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-64877df8f5-fjq62", "timestamp":"2025-02-13 16:02:48.71641597 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4855] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.752 [INFO][4855] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.752 [INFO][4855] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.837 [INFO][4855] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.839 [INFO][4855] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.849 [INFO][4855] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.850 [INFO][4855] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.852 [INFO][4855] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.852 [INFO][4855] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.853 [INFO][4855] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8 Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.857 [INFO][4855] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.861 [INFO][4855] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.861 [INFO][4855] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" host="localhost" Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.861 [INFO][4855] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:02:48.879927 containerd[1568]: 2025-02-13 16:02:48.861 [INFO][4855] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" HandleID="k8s-pod-network.1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Workload="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.863 [INFO][4781] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0", GenerateName:"calico-apiserver-64877df8f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba8e0efe-98a0-4c27-9797-c702cafd7556", ResourceVersion:"665", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64877df8f5", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-64877df8f5-fjq62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93bb6d70fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.863 [INFO][4781] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.863 [INFO][4781] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib93bb6d70fc ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.866 [INFO][4781] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.867 [INFO][4781] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0", GenerateName:"calico-apiserver-64877df8f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"ba8e0efe-98a0-4c27-9797-c702cafd7556", ResourceVersion:"665", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"64877df8f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8", Pod:"calico-apiserver-64877df8f5-fjq62", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib93bb6d70fc", MAC:"8e:84:19:61:d5:67", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.880390 containerd[1568]: 2025-02-13 16:02:48.877 [INFO][4781] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8" Namespace="calico-apiserver" Pod="calico-apiserver-64877df8f5-fjq62" WorkloadEndpoint="localhost-k8s-calico--apiserver--64877df8f5--fjq62-eth0" Feb 13 16:02:48.900866 containerd[1568]: time="2025-02-13T16:02:48.900142854Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:48.900866 containerd[1568]: time="2025-02-13T16:02:48.900184554Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:48.900866 containerd[1568]: time="2025-02-13T16:02:48.900192811Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:48.900866 containerd[1568]: time="2025-02-13T16:02:48.900245359Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:48.915655 systemd[1]: run-containerd-runc-k8s.io-1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8-runc.3E38QW.mount: Deactivated successfully. Feb 13 16:02:48.931051 systemd[1]: Started cri-containerd-1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8.scope - libcontainer container 1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8. 
Feb 13 16:02:48.959119 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:02:48.965279 systemd-networkd[1271]: calic971932ce6a: Link UP Feb 13 16:02:48.966623 systemd-networkd[1271]: calic971932ce6a: Gained carrier Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.414 [INFO][4832] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.430 [INFO][4832] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0 coredns-6f6b679f8f- kube-system 661d4ddb-a267-4645-9fad-e5fa882fa0db 663 0 2025-02-13 16:02:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-d2wxt eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic971932ce6a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.430 [INFO][4832] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4864] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" HandleID="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" 
Workload="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4864] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" HandleID="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Workload="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003396c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-d2wxt", "timestamp":"2025-02-13 16:02:48.716460617 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4864] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.861 [INFO][4864] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.863 [INFO][4864] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.938 [INFO][4864] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.940 [INFO][4864] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.947 [INFO][4864] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.948 [INFO][4864] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.949 [INFO][4864] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.949 [INFO][4864] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.949 [INFO][4864] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410 Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.955 [INFO][4864] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4864] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4864] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" host="localhost" Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4864] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:02:48.981026 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4864] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" HandleID="k8s-pod-network.8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Workload="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.962 [INFO][4832] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"661d4ddb-a267-4645-9fad-e5fa882fa0db", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-d2wxt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic971932ce6a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.962 [INFO][4832] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.962 [INFO][4832] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic971932ce6a ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.966 [INFO][4832] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 
16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.967 [INFO][4832] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"661d4ddb-a267-4645-9fad-e5fa882fa0db", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410", Pod:"coredns-6f6b679f8f-d2wxt", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic971932ce6a", MAC:"32:36:d0:cc:93:7e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:48.981827 containerd[1568]: 2025-02-13 16:02:48.977 [INFO][4832] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410" Namespace="kube-system" Pod="coredns-6f6b679f8f-d2wxt" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--d2wxt-eth0" Feb 13 16:02:49.012482 containerd[1568]: time="2025-02-13T16:02:49.012256843Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:49.012482 containerd[1568]: time="2025-02-13T16:02:49.012297345Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:49.012482 containerd[1568]: time="2025-02-13T16:02:49.012305154Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.012482 containerd[1568]: time="2025-02-13T16:02:49.012372411Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.032083 systemd[1]: Started cri-containerd-8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410.scope - libcontainer container 8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410. 
Feb 13 16:02:49.039403 containerd[1568]: time="2025-02-13T16:02:49.039374886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-64877df8f5-fjq62,Uid:ba8e0efe-98a0-4c27-9797-c702cafd7556,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8\"" Feb 13 16:02:49.049418 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:02:49.078866 systemd-networkd[1271]: cali2b58967dd27: Link UP Feb 13 16:02:49.079322 systemd-networkd[1271]: cali2b58967dd27: Gained carrier Feb 13 16:02:49.099258 containerd[1568]: time="2025-02-13T16:02:49.099225321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-d2wxt,Uid:661d4ddb-a267-4645-9fad-e5fa882fa0db,Namespace:kube-system,Attempt:6,} returns sandbox id \"8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410\"" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.305 [INFO][4769] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.388 [INFO][4769] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--6f6b679f8f--dt72t-eth0 coredns-6f6b679f8f- kube-system 68c42bcf-2d0d-494d-9abe-69e38a01bdd9 666 0 2025-02-13 16:02:22 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-6f6b679f8f-dt72t eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali2b58967dd27 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-" 
Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.388 [INFO][4769] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4858] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" HandleID="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Workload="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4858] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" HandleID="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Workload="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103cd0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-6f6b679f8f-dt72t", "timestamp":"2025-02-13 16:02:48.71636396 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4858] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4858] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:48.960 [INFO][4858] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.042 [INFO][4858] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.047 [INFO][4858] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.051 [INFO][4858] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.053 [INFO][4858] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.056 [INFO][4858] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.056 [INFO][4858] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.057 [INFO][4858] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17 Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.060 [INFO][4858] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4858] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4858] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" host="localhost" Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4858] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:02:49.103429 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4858] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" HandleID="k8s-pod-network.6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Workload="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.073 [INFO][4769] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--dt72t-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"68c42bcf-2d0d-494d-9abe-69e38a01bdd9", ResourceVersion:"666", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-6f6b679f8f-dt72t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b58967dd27", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.073 [INFO][4769] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.073 [INFO][4769] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b58967dd27 ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.079 [INFO][4769] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 
16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.079 [INFO][4769] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--6f6b679f8f--dt72t-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"68c42bcf-2d0d-494d-9abe-69e38a01bdd9", ResourceVersion:"666", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 22, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17", Pod:"coredns-6f6b679f8f-dt72t", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali2b58967dd27", MAC:"ce:ee:a3:93:d9:49", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.105007 containerd[1568]: 2025-02-13 16:02:49.097 [INFO][4769] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17" Namespace="kube-system" Pod="coredns-6f6b679f8f-dt72t" WorkloadEndpoint="localhost-k8s-coredns--6f6b679f8f--dt72t-eth0" Feb 13 16:02:49.105411 containerd[1568]: time="2025-02-13T16:02:49.105387642Z" level=info msg="CreateContainer within sandbox \"8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 16:02:49.124143 containerd[1568]: time="2025-02-13T16:02:49.124010807Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:49.124143 containerd[1568]: time="2025-02-13T16:02:49.124042735Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:49.124143 containerd[1568]: time="2025-02-13T16:02:49.124049340Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.124259 containerd[1568]: time="2025-02-13T16:02:49.124113398Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.142028 containerd[1568]: time="2025-02-13T16:02:49.141987216Z" level=info msg="CreateContainer within sandbox \"8b195495990155b3f50917be1e28f323749eb7682455a52337f9a28721784410\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7bbe48a423eae4b61b9fa76b09e8df0d0fb2643fc2e9b9a8fa801fb73f64bc8b\"" Feb 13 16:02:49.144329 containerd[1568]: time="2025-02-13T16:02:49.144311287Z" level=info msg="StartContainer for \"7bbe48a423eae4b61b9fa76b09e8df0d0fb2643fc2e9b9a8fa801fb73f64bc8b\"" Feb 13 16:02:49.150010 systemd[1]: Started cri-containerd-6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17.scope - libcontainer container 6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17. Feb 13 16:02:49.171683 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:02:49.184039 systemd[1]: Started cri-containerd-7bbe48a423eae4b61b9fa76b09e8df0d0fb2643fc2e9b9a8fa801fb73f64bc8b.scope - libcontainer container 7bbe48a423eae4b61b9fa76b09e8df0d0fb2643fc2e9b9a8fa801fb73f64bc8b. 
Feb 13 16:02:49.191445 systemd-networkd[1271]: calic8c72a681c8: Link UP Feb 13 16:02:49.192038 systemd-networkd[1271]: calic8c72a681c8: Gained carrier Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.389 [INFO][4818] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.403 [INFO][4818] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0 calico-kube-controllers-cd99f8587- calico-system c05aa49b-8336-4728-802a-0e8f6d326479 668 0 2025-02-13 16:02:29 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:cd99f8587 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-cd99f8587-nvllq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calic8c72a681c8 [] []}} ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.403 [INFO][4818] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4860] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" 
HandleID="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Workload="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4860] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" HandleID="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Workload="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ba7e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-cd99f8587-nvllq", "timestamp":"2025-02-13 16:02:48.716283247 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:48.734 [INFO][4860] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4860] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.068 [INFO][4860] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.140 [INFO][4860] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.151 [INFO][4860] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.157 [INFO][4860] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.160 [INFO][4860] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.170 [INFO][4860] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.170 [INFO][4860] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.175 [INFO][4860] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32 Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.177 [INFO][4860] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.184 [INFO][4860] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.185 [INFO][4860] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" host="localhost" Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.185 [INFO][4860] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 16:02:49.208721 containerd[1568]: 2025-02-13 16:02:49.185 [INFO][4860] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" HandleID="k8s-pod-network.160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Workload="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.187 [INFO][4818] cni-plugin/k8s.go 386: Populated endpoint ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0", GenerateName:"calico-kube-controllers-cd99f8587-", Namespace:"calico-system", SelfLink:"", UID:"c05aa49b-8336-4728-802a-0e8f6d326479", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cd99f8587", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-cd99f8587-nvllq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8c72a681c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.187 [INFO][4818] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.187 [INFO][4818] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic8c72a681c8 ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.193 [INFO][4818] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.193 [INFO][4818] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0", GenerateName:"calico-kube-controllers-cd99f8587-", Namespace:"calico-system", SelfLink:"", UID:"c05aa49b-8336-4728-802a-0e8f6d326479", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"cd99f8587", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32", Pod:"calico-kube-controllers-cd99f8587-nvllq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calic8c72a681c8", MAC:"86:be:b8:48:49:c3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.209272 containerd[1568]: 2025-02-13 16:02:49.204 [INFO][4818] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32" Namespace="calico-system" Pod="calico-kube-controllers-cd99f8587-nvllq" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--cd99f8587--nvllq-eth0" Feb 13 16:02:49.235463 containerd[1568]: time="2025-02-13T16:02:49.235404087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-dt72t,Uid:68c42bcf-2d0d-494d-9abe-69e38a01bdd9,Namespace:kube-system,Attempt:6,} returns sandbox id \"6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17\"" Feb 13 16:02:49.238883 containerd[1568]: time="2025-02-13T16:02:49.238861763Z" level=info msg="CreateContainer within sandbox \"6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 16:02:49.270306 containerd[1568]: time="2025-02-13T16:02:49.270214218Z" level=info msg="CreateContainer within sandbox \"6c75481de14c232071a52f95bcce5cc775b7386c32fae0e08da9f74ca01bed17\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"30d3bdf9faf2855a90ba8dba3332e92a8d983882b69b62eebfa5a42bffa4201c\"" Feb 13 16:02:49.274481 containerd[1568]: time="2025-02-13T16:02:49.274419450Z" level=info msg="StartContainer for \"7bbe48a423eae4b61b9fa76b09e8df0d0fb2643fc2e9b9a8fa801fb73f64bc8b\" returns successfully" Feb 13 16:02:49.275701 containerd[1568]: time="2025-02-13T16:02:49.274772044Z" level=info msg="StartContainer for \"30d3bdf9faf2855a90ba8dba3332e92a8d983882b69b62eebfa5a42bffa4201c\"" Feb 13 16:02:49.280215 containerd[1568]: time="2025-02-13T16:02:49.270811386Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:02:49.280215 containerd[1568]: time="2025-02-13T16:02:49.272317953Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:02:49.280215 containerd[1568]: time="2025-02-13T16:02:49.272330080Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.280215 containerd[1568]: time="2025-02-13T16:02:49.272433667Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:02:49.301185 systemd[1]: Started cri-containerd-160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32.scope - libcontainer container 160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32. Feb 13 16:02:49.317851 systemd-networkd[1271]: cali2f9a68ae882: Link UP Feb 13 16:02:49.327396 kubelet[2825]: I0213 16:02:49.327306 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-d2wxt" podStartSLOduration=27.327021775 podStartE2EDuration="27.327021775s" podCreationTimestamp="2025-02-13 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:49.32507873 +0000 UTC m=+31.862433137" watchObservedRunningTime="2025-02-13 16:02:49.327021775 +0000 UTC m=+31.864376185" Feb 13 16:02:49.330456 systemd-networkd[1271]: cali2f9a68ae882: Gained carrier Feb 13 16:02:49.345076 systemd[1]: Started cri-containerd-30d3bdf9faf2855a90ba8dba3332e92a8d983882b69b62eebfa5a42bffa4201c.scope - libcontainer container 30d3bdf9faf2855a90ba8dba3332e92a8d983882b69b62eebfa5a42bffa4201c. 
Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.354 [INFO][4800] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.388 [INFO][4800] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--cv4n7-eth0 csi-node-driver- calico-system fb1149e0-8e00-49ff-a8bd-416370ecd365 580 0 2025-02-13 16:02:29 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:56747c9949 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-cv4n7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali2f9a68ae882 [] []}} ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.388 [INFO][4800] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.716 [INFO][4859] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" HandleID="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Workload="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.736 [INFO][4859] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" HandleID="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Workload="localhost-k8s-csi--node--driver--cv4n7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-cv4n7", "timestamp":"2025-02-13 16:02:48.716223876 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:48.736 [INFO][4859] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.185 [INFO][4859] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.185 [INFO][4859] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.244 [INFO][4859] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.252 [INFO][4859] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.260 [INFO][4859] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.262 [INFO][4859] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.270 [INFO][4859] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.271 [INFO][4859] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.276 [INFO][4859] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18 Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.283 [INFO][4859] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.289 [INFO][4859] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.290 [INFO][4859] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" host="localhost" Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.291 [INFO][4859] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:02:49.351053 containerd[1568]: 2025-02-13 16:02:49.291 [INFO][4859] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" HandleID="k8s-pod-network.29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Workload="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.306 [INFO][4800] cni-plugin/k8s.go 386: Populated endpoint ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--cv4n7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fb1149e0-8e00-49ff-a8bd-416370ecd365", ResourceVersion:"580", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-cv4n7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2f9a68ae882", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.307 [INFO][4800] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.307 [INFO][4800] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2f9a68ae882 ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.332 [INFO][4800] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.333 [INFO][4800] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--cv4n7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"fb1149e0-8e00-49ff-a8bd-416370ecd365", ResourceVersion:"580", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 2, 29, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"56747c9949", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18", Pod:"csi-node-driver-cv4n7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali2f9a68ae882", MAC:"b6:26:57:d2:f7:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:02:49.353848 containerd[1568]: 2025-02-13 16:02:49.348 [INFO][4800] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18" Namespace="calico-system" Pod="csi-node-driver-cv4n7" WorkloadEndpoint="localhost-k8s-csi--node--driver--cv4n7-eth0" Feb 13 16:02:49.390032 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:02:49.391110 containerd[1568]: time="2025-02-13T16:02:49.391090614Z" level=info msg="StartContainer for \"30d3bdf9faf2855a90ba8dba3332e92a8d983882b69b62eebfa5a42bffa4201c\" returns successfully" Feb 13 16:02:49.394556 containerd[1568]: time="2025-02-13T16:02:49.394250005Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 16:02:49.394556 containerd[1568]: time="2025-02-13T16:02:49.394286427Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 16:02:49.394556 containerd[1568]: time="2025-02-13T16:02:49.394293465Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:49.394556 containerd[1568]: time="2025-02-13T16:02:49.394349950Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 16:02:49.413114 systemd[1]: Started cri-containerd-29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18.scope - libcontainer container 29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18.
Feb 13 16:02:49.477678 systemd-resolved[1487]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
Feb 13 16:02:49.487719 containerd[1568]: time="2025-02-13T16:02:49.487647980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cv4n7,Uid:fb1149e0-8e00-49ff-a8bd-416370ecd365,Namespace:calico-system,Attempt:7,} returns sandbox id \"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18\""
Feb 13 16:02:49.510490 containerd[1568]: time="2025-02-13T16:02:49.510340824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-cd99f8587-nvllq,Uid:c05aa49b-8336-4728-802a-0e8f6d326479,Namespace:calico-system,Attempt:6,} returns sandbox id \"160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32\""
Feb 13 16:02:49.583929 kernel: bpftool[5421]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set
Feb 13 16:02:49.762115 systemd-networkd[1271]: vxlan.calico: Link UP
Feb 13 16:02:49.762119 systemd-networkd[1271]: vxlan.calico: Gained carrier
Feb 13 16:02:50.285093 systemd-networkd[1271]: cali2b58967dd27: Gained IPv6LL
Feb 13 16:02:50.357328 kubelet[2825]: I0213 16:02:50.357105 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-dt72t" podStartSLOduration=28.357092949 podStartE2EDuration="28.357092949s" podCreationTimestamp="2025-02-13 16:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:02:50.347710113 +0000 UTC m=+32.885064528" watchObservedRunningTime="2025-02-13 16:02:50.357092949 +0000 UTC m=+32.894447359"
Feb 13 16:02:50.413995 systemd-networkd[1271]: cali3ffea42fa42: Gained IPv6LL
Feb 13 16:02:50.733194 systemd-networkd[1271]: calib93bb6d70fc: Gained IPv6LL
Feb 13 16:02:50.797053 systemd-networkd[1271]: vxlan.calico: Gained IPv6LL
Feb 13 16:02:50.861074 systemd-networkd[1271]: calic971932ce6a: Gained IPv6LL
Feb 13 16:02:50.861372 systemd-networkd[1271]: calic8c72a681c8: Gained IPv6LL
Feb 13 16:02:50.926000 systemd-networkd[1271]: cali2f9a68ae882: Gained IPv6LL
Feb 13 16:02:51.906478 containerd[1568]: time="2025-02-13T16:02:51.906438893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:51.907504 containerd[1568]: time="2025-02-13T16:02:51.907353908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404"
Feb 13 16:02:51.907855 containerd[1568]: time="2025-02-13T16:02:51.907828614Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:51.908895 containerd[1568]: time="2025-02-13T16:02:51.908865033Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:51.909454 containerd[1568]: time="2025-02-13T16:02:51.909346120Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 3.05081869s"
Feb 13 16:02:51.909454 containerd[1568]: time="2025-02-13T16:02:51.909363297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\""
Feb 13 16:02:51.910356 containerd[1568]: time="2025-02-13T16:02:51.910337854Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\""
Feb 13 16:02:51.911130 containerd[1568]: time="2025-02-13T16:02:51.910813658Z" level=info msg="CreateContainer within sandbox \"1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Feb 13 16:02:51.925831 containerd[1568]: time="2025-02-13T16:02:51.925799771Z" level=info msg="CreateContainer within sandbox \"1dfa654e890b52b0b3133bc28fdb613bb16cd4918452e49ccdaab7e60bf5fa87\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b4076ba1fa9138f2462c1d64b7cc4cb160aa8a61743cc337421069f35909fb50\""
Feb 13 16:02:51.926685 containerd[1568]: time="2025-02-13T16:02:51.926663263Z" level=info msg="StartContainer for \"b4076ba1fa9138f2462c1d64b7cc4cb160aa8a61743cc337421069f35909fb50\""
Feb 13 16:02:51.946040 systemd[1]: Started cri-containerd-b4076ba1fa9138f2462c1d64b7cc4cb160aa8a61743cc337421069f35909fb50.scope - libcontainer container b4076ba1fa9138f2462c1d64b7cc4cb160aa8a61743cc337421069f35909fb50.
Feb 13 16:02:51.978406 containerd[1568]: time="2025-02-13T16:02:51.978381227Z" level=info msg="StartContainer for \"b4076ba1fa9138f2462c1d64b7cc4cb160aa8a61743cc337421069f35909fb50\" returns successfully"
Feb 13 16:02:52.341710 kubelet[2825]: I0213 16:02:52.341549 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64877df8f5-rqk9c" podStartSLOduration=21.289985871 podStartE2EDuration="24.341519615s" podCreationTimestamp="2025-02-13 16:02:28 +0000 UTC" firstStartedPulling="2025-02-13 16:02:48.858309017 +0000 UTC m=+31.395663423" lastFinishedPulling="2025-02-13 16:02:51.909842761 +0000 UTC m=+34.447197167" observedRunningTime="2025-02-13 16:02:52.340558838 +0000 UTC m=+34.877913262" watchObservedRunningTime="2025-02-13 16:02:52.341519615 +0000 UTC m=+34.878874043"
Feb 13 16:02:52.438528 containerd[1568]: time="2025-02-13T16:02:52.438243705Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:52.439242 containerd[1568]: time="2025-02-13T16:02:52.438984829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77"
Feb 13 16:02:52.439895 containerd[1568]: time="2025-02-13T16:02:52.439873819Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 529.517039ms"
Feb 13 16:02:52.439895 containerd[1568]: time="2025-02-13T16:02:52.439890513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\""
Feb 13 16:02:52.441747 containerd[1568]: time="2025-02-13T16:02:52.441729080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\""
Feb 13 16:02:52.442084 containerd[1568]: time="2025-02-13T16:02:52.442067493Z" level=info msg="CreateContainer within sandbox \"1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Feb 13 16:02:52.449084 containerd[1568]: time="2025-02-13T16:02:52.449058622Z" level=info msg="CreateContainer within sandbox \"1ec9f9ff52893e97874dc8d5b70e0d98a4c4bea360278cf9ab3ff84aaf2c5ae8\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e5ea6080689e65f90ebea891f94a3ebeabdd184f8accd09a93a3ab390b6ad7bc\""
Feb 13 16:02:52.449496 containerd[1568]: time="2025-02-13T16:02:52.449486156Z" level=info msg="StartContainer for \"e5ea6080689e65f90ebea891f94a3ebeabdd184f8accd09a93a3ab390b6ad7bc\""
Feb 13 16:02:52.495208 systemd[1]: Started cri-containerd-e5ea6080689e65f90ebea891f94a3ebeabdd184f8accd09a93a3ab390b6ad7bc.scope - libcontainer container e5ea6080689e65f90ebea891f94a3ebeabdd184f8accd09a93a3ab390b6ad7bc.
Feb 13 16:02:52.550626 containerd[1568]: time="2025-02-13T16:02:52.550598162Z" level=info msg="StartContainer for \"e5ea6080689e65f90ebea891f94a3ebeabdd184f8accd09a93a3ab390b6ad7bc\" returns successfully"
Feb 13 16:02:53.343891 kubelet[2825]: I0213 16:02:53.343809 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-64877df8f5-fjq62" podStartSLOduration=21.945116872 podStartE2EDuration="25.343777563s" podCreationTimestamp="2025-02-13 16:02:28 +0000 UTC" firstStartedPulling="2025-02-13 16:02:49.041603825 +0000 UTC m=+31.578958237" lastFinishedPulling="2025-02-13 16:02:52.440264521 +0000 UTC m=+34.977618928" observedRunningTime="2025-02-13 16:02:53.342424729 +0000 UTC m=+35.879779143" watchObservedRunningTime="2025-02-13 16:02:53.343777563 +0000 UTC m=+35.881131979"
Feb 13 16:02:54.146496 containerd[1568]: time="2025-02-13T16:02:54.146467094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:54.147207 containerd[1568]: time="2025-02-13T16:02:54.146796482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632"
Feb 13 16:02:54.147427 containerd[1568]: time="2025-02-13T16:02:54.147395323Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:54.148924 containerd[1568]: time="2025-02-13T16:02:54.148816139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:54.149655 containerd[1568]: time="2025-02-13T16:02:54.149532642Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.707706325s"
Feb 13 16:02:54.149655 containerd[1568]: time="2025-02-13T16:02:54.149558695Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\""
Feb 13 16:02:54.150622 containerd[1568]: time="2025-02-13T16:02:54.150435274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\""
Feb 13 16:02:54.151603 containerd[1568]: time="2025-02-13T16:02:54.151587466Z" level=info msg="CreateContainer within sandbox \"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Feb 13 16:02:54.182853 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount432930829.mount: Deactivated successfully.
Feb 13 16:02:54.188676 containerd[1568]: time="2025-02-13T16:02:54.188650628Z" level=info msg="CreateContainer within sandbox \"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8\""
Feb 13 16:02:54.189877 containerd[1568]: time="2025-02-13T16:02:54.189070690Z" level=info msg="StartContainer for \"25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8\""
Feb 13 16:02:54.209461 systemd[1]: run-containerd-runc-k8s.io-25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8-runc.vLkKI0.mount: Deactivated successfully.
Feb 13 16:02:54.216059 systemd[1]: Started cri-containerd-25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8.scope - libcontainer container 25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8.
Feb 13 16:02:54.242292 containerd[1568]: time="2025-02-13T16:02:54.242264038Z" level=info msg="StartContainer for \"25dc99b8152fa7d12da40cb65639bb1b26dddc71568f59642de362f56766a3e8\" returns successfully"
Feb 13 16:02:54.375290 kubelet[2825]: I0213 16:02:54.375266 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 13 16:02:56.623629 containerd[1568]: time="2025-02-13T16:02:56.623598382Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:56.630248 containerd[1568]: time="2025-02-13T16:02:56.628086250Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192"
Feb 13 16:02:56.655970 containerd[1568]: time="2025-02-13T16:02:56.655895285Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:56.656480 containerd[1568]: time="2025-02-13T16:02:56.655932317Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.505479276s"
Feb 13 16:02:56.656480 containerd[1568]: time="2025-02-13T16:02:56.656051293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\""
Feb 13 16:02:56.657239 containerd[1568]: time="2025-02-13T16:02:56.657155918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:56.658338 containerd[1568]: time="2025-02-13T16:02:56.658025016Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\""
Feb 13 16:02:56.669369 containerd[1568]: time="2025-02-13T16:02:56.669348720Z" level=info msg="CreateContainer within sandbox \"160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Feb 13 16:02:56.690389 containerd[1568]: time="2025-02-13T16:02:56.690364583Z" level=info msg="CreateContainer within sandbox \"160e26a9963f53328049a5b605078dcc4129aa2b1e4c45e62323a34a1e847d32\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"15327a8550e45cfaec4bde0db098bf66dca5e9ab27640dc0abd7863aeb6e2cbb\""
Feb 13 16:02:56.691131 containerd[1568]: time="2025-02-13T16:02:56.691086418Z" level=info msg="StartContainer for \"15327a8550e45cfaec4bde0db098bf66dca5e9ab27640dc0abd7863aeb6e2cbb\""
Feb 13 16:02:56.733036 systemd[1]: Started cri-containerd-15327a8550e45cfaec4bde0db098bf66dca5e9ab27640dc0abd7863aeb6e2cbb.scope - libcontainer container 15327a8550e45cfaec4bde0db098bf66dca5e9ab27640dc0abd7863aeb6e2cbb.
Feb 13 16:02:56.777635 containerd[1568]: time="2025-02-13T16:02:56.777565052Z" level=info msg="StartContainer for \"15327a8550e45cfaec4bde0db098bf66dca5e9ab27640dc0abd7863aeb6e2cbb\" returns successfully"
Feb 13 16:02:57.402867 kubelet[2825]: I0213 16:02:57.402819 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-cd99f8587-nvllq" podStartSLOduration=21.256853594 podStartE2EDuration="28.402803824s" podCreationTimestamp="2025-02-13 16:02:29 +0000 UTC" firstStartedPulling="2025-02-13 16:02:49.511416951 +0000 UTC m=+32.048771358" lastFinishedPulling="2025-02-13 16:02:56.657367178 +0000 UTC m=+39.194721588" observedRunningTime="2025-02-13 16:02:57.402375649 +0000 UTC m=+39.939730065" watchObservedRunningTime="2025-02-13 16:02:57.402803824 +0000 UTC m=+39.940158233"
Feb 13 16:02:58.767472 containerd[1568]: time="2025-02-13T16:02:58.767439059Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:58.774037 containerd[1568]: time="2025-02-13T16:02:58.774002178Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081"
Feb 13 16:02:58.784619 containerd[1568]: time="2025-02-13T16:02:58.784591696Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:58.796368 containerd[1568]: time="2025-02-13T16:02:58.796326895Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 16:02:58.796972 containerd[1568]: time="2025-02-13T16:02:58.796647521Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.138601289s"
Feb 13 16:02:58.796972 containerd[1568]: time="2025-02-13T16:02:58.796668072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\""
Feb 13 16:02:58.915386 containerd[1568]: time="2025-02-13T16:02:58.915236199Z" level=info msg="CreateContainer within sandbox \"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Feb 13 16:02:58.952641 containerd[1568]: time="2025-02-13T16:02:58.952588186Z" level=info msg="CreateContainer within sandbox \"29ffce9a9a4bfaeb28d2e387681153285a35430763590d10db184c83768b0e18\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"291bfaeefca72d340bf2879baa953801570eb04685d8642805aa27c66b976867\""
Feb 13 16:02:58.952925 containerd[1568]: time="2025-02-13T16:02:58.952865259Z" level=info msg="StartContainer for \"291bfaeefca72d340bf2879baa953801570eb04685d8642805aa27c66b976867\""
Feb 13 16:02:59.006096 systemd[1]: Started cri-containerd-291bfaeefca72d340bf2879baa953801570eb04685d8642805aa27c66b976867.scope - libcontainer container 291bfaeefca72d340bf2879baa953801570eb04685d8642805aa27c66b976867.
Feb 13 16:02:59.035876 containerd[1568]: time="2025-02-13T16:02:59.035813234Z" level=info msg="StartContainer for \"291bfaeefca72d340bf2879baa953801570eb04685d8642805aa27c66b976867\" returns successfully"
Feb 13 16:02:59.432350 kubelet[2825]: I0213 16:02:59.432310 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-cv4n7" podStartSLOduration=21.124295392 podStartE2EDuration="30.432296415s" podCreationTimestamp="2025-02-13 16:02:29 +0000 UTC" firstStartedPulling="2025-02-13 16:02:49.489019446 +0000 UTC m=+32.026373853" lastFinishedPulling="2025-02-13 16:02:58.79702047 +0000 UTC m=+41.334374876" observedRunningTime="2025-02-13 16:02:59.431977681 +0000 UTC m=+41.969332096" watchObservedRunningTime="2025-02-13 16:02:59.432296415 +0000 UTC m=+41.969650830"
Feb 13 16:03:00.011940 kubelet[2825]: I0213 16:03:00.010739 2825 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Feb 13 16:03:00.019393 kubelet[2825]: I0213 16:03:00.019366 2825 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Feb 13 16:03:17.661484 containerd[1568]: time="2025-02-13T16:03:17.661455040Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\""
Feb 13 16:03:17.662053 containerd[1568]: time="2025-02-13T16:03:17.661922399Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully"
Feb 13 16:03:17.662053 containerd[1568]: time="2025-02-13T16:03:17.661932597Z" level=info msg="StopPodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully"
Feb 13 16:03:17.703720 containerd[1568]: time="2025-02-13T16:03:17.703687948Z" level=info msg="RemovePodSandbox for \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\""
Feb 13 16:03:17.716736 containerd[1568]: time="2025-02-13T16:03:17.716697395Z" level=info msg="Forcibly stopping sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\""
Feb 13 16:03:17.728262 containerd[1568]: time="2025-02-13T16:03:17.716794215Z" level=info msg="TearDown network for sandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" successfully"
Feb 13 16:03:17.733668 containerd[1568]: time="2025-02-13T16:03:17.733626857Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.737731 containerd[1568]: time="2025-02-13T16:03:17.737685842Z" level=info msg="RemovePodSandbox \"e0efda2e80347012dd42e05621f376d64d3e9b086d650c17944b85e21cc9d010\" returns successfully"
Feb 13 16:03:17.739617 containerd[1568]: time="2025-02-13T16:03:17.739591217Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\""
Feb 13 16:03:17.739692 containerd[1568]: time="2025-02-13T16:03:17.739675662Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully"
Feb 13 16:03:17.739692 containerd[1568]: time="2025-02-13T16:03:17.739683902Z" level=info msg="StopPodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully"
Feb 13 16:03:17.740177 containerd[1568]: time="2025-02-13T16:03:17.739931448Z" level=info msg="RemovePodSandbox for \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\""
Feb 13 16:03:17.740177 containerd[1568]: time="2025-02-13T16:03:17.739948786Z" level=info msg="Forcibly stopping sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\""
Feb 13 16:03:17.740177 containerd[1568]: time="2025-02-13T16:03:17.739992208Z" level=info msg="TearDown network for sandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" successfully"
Feb 13 16:03:17.742158 containerd[1568]: time="2025-02-13T16:03:17.742140590Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.742311 containerd[1568]: time="2025-02-13T16:03:17.742242754Z" level=info msg="RemovePodSandbox \"a48992fe64f816e130884e6efbce42bcbe9850338024ec51018864cf361533d1\" returns successfully"
Feb 13 16:03:17.742508 containerd[1568]: time="2025-02-13T16:03:17.742484784Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\""
Feb 13 16:03:17.742556 containerd[1568]: time="2025-02-13T16:03:17.742545193Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully"
Feb 13 16:03:17.742556 containerd[1568]: time="2025-02-13T16:03:17.742553065Z" level=info msg="StopPodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully"
Feb 13 16:03:17.743800 containerd[1568]: time="2025-02-13T16:03:17.742696253Z" level=info msg="RemovePodSandbox for \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\""
Feb 13 16:03:17.743800 containerd[1568]: time="2025-02-13T16:03:17.742720428Z" level=info msg="Forcibly stopping sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\""
Feb 13 16:03:17.743800 containerd[1568]: time="2025-02-13T16:03:17.742764560Z" level=info msg="TearDown network for sandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" successfully"
Feb 13 16:03:17.744597 containerd[1568]: time="2025-02-13T16:03:17.744578062Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.744646 containerd[1568]: time="2025-02-13T16:03:17.744613453Z" level=info msg="RemovePodSandbox \"a04164dd2bdc95b1ce91bffb782defee913b956e75607904b8ed2b4c6d48a59c\" returns successfully"
Feb 13 16:03:17.744961 containerd[1568]: time="2025-02-13T16:03:17.744897202Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\""
Feb 13 16:03:17.745113 containerd[1568]: time="2025-02-13T16:03:17.744952594Z" level=info msg="TearDown network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" successfully"
Feb 13 16:03:17.745113 containerd[1568]: time="2025-02-13T16:03:17.745018109Z" level=info msg="StopPodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" returns successfully"
Feb 13 16:03:17.746366 containerd[1568]: time="2025-02-13T16:03:17.745297035Z" level=info msg="RemovePodSandbox for \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\""
Feb 13 16:03:17.746366 containerd[1568]: time="2025-02-13T16:03:17.745312150Z" level=info msg="Forcibly stopping sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\""
Feb 13 16:03:17.746366 containerd[1568]: time="2025-02-13T16:03:17.745353429Z" level=info msg="TearDown network for sandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" successfully"
Feb 13 16:03:17.746821 containerd[1568]: time="2025-02-13T16:03:17.746807739Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.746881 containerd[1568]: time="2025-02-13T16:03:17.746872912Z" level=info msg="RemovePodSandbox \"ddbee190e5df87d79973e2eacebb16a3dfbb1cd8c4fe0a2c4c9ac61d34d085d9\" returns successfully"
Feb 13 16:03:17.747129 containerd[1568]: time="2025-02-13T16:03:17.747107457Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\""
Feb 13 16:03:17.747180 containerd[1568]: time="2025-02-13T16:03:17.747165110Z" level=info msg="TearDown network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" successfully"
Feb 13 16:03:17.747180 containerd[1568]: time="2025-02-13T16:03:17.747177062Z" level=info msg="StopPodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" returns successfully"
Feb 13 16:03:17.747391 containerd[1568]: time="2025-02-13T16:03:17.747351653Z" level=info msg="RemovePodSandbox for \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\""
Feb 13 16:03:17.747420 containerd[1568]: time="2025-02-13T16:03:17.747384064Z" level=info msg="Forcibly stopping sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\""
Feb 13 16:03:17.747515 containerd[1568]: time="2025-02-13T16:03:17.747444723Z" level=info msg="TearDown network for sandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" successfully"
Feb 13 16:03:17.750595 containerd[1568]: time="2025-02-13T16:03:17.750577144Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.750890 containerd[1568]: time="2025-02-13T16:03:17.750611768Z" level=info msg="RemovePodSandbox \"98d01d6c4bb6767deb2b29c11bb775f887be1a1d3a15cbe58169cbb78346e015\" returns successfully"
Feb 13 16:03:17.750890 containerd[1568]: time="2025-02-13T16:03:17.750798072Z" level=info msg="StopPodSandbox for \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\""
Feb 13 16:03:17.750890 containerd[1568]: time="2025-02-13T16:03:17.750845755Z" level=info msg="TearDown network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" successfully"
Feb 13 16:03:17.750890 containerd[1568]: time="2025-02-13T16:03:17.750852595Z" level=info msg="StopPodSandbox for \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" returns successfully"
Feb 13 16:03:17.751071 containerd[1568]: time="2025-02-13T16:03:17.751053507Z" level=info msg="RemovePodSandbox for \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\""
Feb 13 16:03:17.751167 containerd[1568]: time="2025-02-13T16:03:17.751136182Z" level=info msg="Forcibly stopping sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\""
Feb 13 16:03:17.751265 containerd[1568]: time="2025-02-13T16:03:17.751223185Z" level=info msg="TearDown network for sandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" successfully"
Feb 13 16:03:17.757930 containerd[1568]: time="2025-02-13T16:03:17.757910288Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.757980 containerd[1568]: time="2025-02-13T16:03:17.757943761Z" level=info msg="RemovePodSandbox \"e4357ee6f4dcfa053a1d247c9470338a35534f1194ad471f4002ceac7164917b\" returns successfully"
Feb 13 16:03:17.758192 containerd[1568]: time="2025-02-13T16:03:17.758110879Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\""
Feb 13 16:03:17.758192 containerd[1568]: time="2025-02-13T16:03:17.758157711Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully"
Feb 13 16:03:17.758192 containerd[1568]: time="2025-02-13T16:03:17.758164019Z" level=info msg="StopPodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully"
Feb 13 16:03:17.764097 containerd[1568]: time="2025-02-13T16:03:17.758329977Z" level=info msg="RemovePodSandbox for \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\""
Feb 13 16:03:17.764097 containerd[1568]: time="2025-02-13T16:03:17.758343312Z" level=info msg="Forcibly stopping sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\""
Feb 13 16:03:17.764097 containerd[1568]: time="2025-02-13T16:03:17.758412425Z" level=info msg="TearDown network for sandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" successfully"
Feb 13 16:03:17.766715 containerd[1568]: time="2025-02-13T16:03:17.766698896Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.766752 containerd[1568]: time="2025-02-13T16:03:17.766730866Z" level=info msg="RemovePodSandbox \"8b5ee367743353821308738c8ab54a091f3500dc1fcd9077cbbfd4ea71860ac2\" returns successfully"
Feb 13 16:03:17.767010 containerd[1568]: time="2025-02-13T16:03:17.766928589Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\""
Feb 13 16:03:17.767010 containerd[1568]: time="2025-02-13T16:03:17.766971342Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully"
Feb 13 16:03:17.767010 containerd[1568]: time="2025-02-13T16:03:17.766977756Z" level=info msg="StopPodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully"
Feb 13 16:03:17.767324 containerd[1568]: time="2025-02-13T16:03:17.767152913Z" level=info msg="RemovePodSandbox for \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\""
Feb 13 16:03:17.767324 containerd[1568]: time="2025-02-13T16:03:17.767244242Z" level=info msg="Forcibly stopping sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\""
Feb 13 16:03:17.767324 containerd[1568]: time="2025-02-13T16:03:17.767283490Z" level=info msg="TearDown network for sandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" successfully"
Feb 13 16:03:17.774234 containerd[1568]: time="2025-02-13T16:03:17.774171198Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.774234 containerd[1568]: time="2025-02-13T16:03:17.774198966Z" level=info msg="RemovePodSandbox \"198aaec023f912eda4f82977bb23bd60d63141d26e65a469e9d8d22d0727f2e5\" returns successfully"
Feb 13 16:03:17.774384 containerd[1568]: time="2025-02-13T16:03:17.774368953Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\""
Feb 13 16:03:17.779845 containerd[1568]: time="2025-02-13T16:03:17.774418515Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully"
Feb 13 16:03:17.779845 containerd[1568]: time="2025-02-13T16:03:17.774425247Z" level=info msg="StopPodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully"
Feb 13 16:03:17.779845 containerd[1568]: time="2025-02-13T16:03:17.774547162Z" level=info msg="RemovePodSandbox for \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\""
Feb 13 16:03:17.779845 containerd[1568]: time="2025-02-13T16:03:17.774557914Z" level=info msg="Forcibly stopping sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\""
Feb 13 16:03:17.779845 containerd[1568]: time="2025-02-13T16:03:17.774694826Z" level=info msg="TearDown network for sandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" successfully"
Feb 13 16:03:17.781054 containerd[1568]: time="2025-02-13T16:03:17.781024491Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 16:03:17.781100 containerd[1568]: time="2025-02-13T16:03:17.781063347Z" level=info msg="RemovePodSandbox \"0b108dd6120a103cb8bb8066fdbfc8fc7ea8f7b4d9d8ce0c53ca5adfd9548828\" returns successfully" Feb 13 16:03:17.781271 containerd[1568]: time="2025-02-13T16:03:17.781256662Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:03:17.785303 containerd[1568]: time="2025-02-13T16:03:17.781555662Z" level=info msg="TearDown network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" successfully" Feb 13 16:03:17.785303 containerd[1568]: time="2025-02-13T16:03:17.781566884Z" level=info msg="StopPodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" returns successfully" Feb 13 16:03:17.785303 containerd[1568]: time="2025-02-13T16:03:17.781794101Z" level=info msg="RemovePodSandbox for \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:03:17.785303 containerd[1568]: time="2025-02-13T16:03:17.781807814Z" level=info msg="Forcibly stopping sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\"" Feb 13 16:03:17.785303 containerd[1568]: time="2025-02-13T16:03:17.781924516Z" level=info msg="TearDown network for sandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" successfully" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787159387Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787189277Z" level=info msg="RemovePodSandbox \"347870b235190ac86b3b1877079488050075c85dd95ba5cf3d9ae9abca826f47\" returns successfully" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787386266Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787451288Z" level=info msg="TearDown network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" successfully" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787462305Z" level=info msg="StopPodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" returns successfully" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787672269Z" level=info msg="RemovePodSandbox for \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787691946Z" level=info msg="Forcibly stopping sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\"" Feb 13 16:03:17.789686 containerd[1568]: time="2025-02-13T16:03:17.787743659Z" level=info msg="TearDown network for sandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" successfully" Feb 13 16:03:17.792482 containerd[1568]: time="2025-02-13T16:03:17.792452134Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.792521 containerd[1568]: time="2025-02-13T16:03:17.792512131Z" level=info msg="RemovePodSandbox \"1586e1f49e2215e2f3d817472736e8515339007c23c1cd8f3c1d34be8a2a107e\" returns successfully" Feb 13 16:03:17.792852 containerd[1568]: time="2025-02-13T16:03:17.792836535Z" level=info msg="StopPodSandbox for \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\"" Feb 13 16:03:17.792952 containerd[1568]: time="2025-02-13T16:03:17.792936434Z" level=info msg="TearDown network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" successfully" Feb 13 16:03:17.793326 containerd[1568]: time="2025-02-13T16:03:17.792952269Z" level=info msg="StopPodSandbox for \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" returns successfully" Feb 13 16:03:17.793326 containerd[1568]: time="2025-02-13T16:03:17.793134195Z" level=info msg="RemovePodSandbox for \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\"" Feb 13 16:03:17.793326 containerd[1568]: time="2025-02-13T16:03:17.793147460Z" level=info msg="Forcibly stopping sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\"" Feb 13 16:03:17.793326 containerd[1568]: time="2025-02-13T16:03:17.793222702Z" level=info msg="TearDown network for sandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" successfully" Feb 13 16:03:17.795270 containerd[1568]: time="2025-02-13T16:03:17.795250622Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.795415 containerd[1568]: time="2025-02-13T16:03:17.795283601Z" level=info msg="RemovePodSandbox \"3b11d5cd5d0843a6a25ba5814e19e7ad43ede61d375c8999f508cd515c5c6afd\" returns successfully" Feb 13 16:03:17.795561 containerd[1568]: time="2025-02-13T16:03:17.795543163Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:03:17.796767 containerd[1568]: time="2025-02-13T16:03:17.795698307Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:03:17.796767 containerd[1568]: time="2025-02-13T16:03:17.795735803Z" level=info msg="StopPodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:03:17.796767 containerd[1568]: time="2025-02-13T16:03:17.795951294Z" level=info msg="RemovePodSandbox for \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:03:17.796767 containerd[1568]: time="2025-02-13T16:03:17.795964435Z" level=info msg="Forcibly stopping sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\"" Feb 13 16:03:17.796767 containerd[1568]: time="2025-02-13T16:03:17.796054765Z" level=info msg="TearDown network for sandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" successfully" Feb 13 16:03:17.797410 containerd[1568]: time="2025-02-13T16:03:17.797393079Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.797441 containerd[1568]: time="2025-02-13T16:03:17.797420131Z" level=info msg="RemovePodSandbox \"4ae66e262c460c2f5650b2a09a118ac0782db1e3f48ac5eb7e9528431c348c68\" returns successfully" Feb 13 16:03:17.797649 containerd[1568]: time="2025-02-13T16:03:17.797635318Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:03:17.797693 containerd[1568]: time="2025-02-13T16:03:17.797680609Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:03:17.797724 containerd[1568]: time="2025-02-13T16:03:17.797700222Z" level=info msg="StopPodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:03:17.798462 containerd[1568]: time="2025-02-13T16:03:17.797917213Z" level=info msg="RemovePodSandbox for \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:03:17.798462 containerd[1568]: time="2025-02-13T16:03:17.797928740Z" level=info msg="Forcibly stopping sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\"" Feb 13 16:03:17.798462 containerd[1568]: time="2025-02-13T16:03:17.797959465Z" level=info msg="TearDown network for sandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" successfully" Feb 13 16:03:17.799263 containerd[1568]: time="2025-02-13T16:03:17.799245961Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.799290 containerd[1568]: time="2025-02-13T16:03:17.799275167Z" level=info msg="RemovePodSandbox \"f2fe6737255fe35acfb745898f08dd3761046e144d295aebf8482110e7b40fc2\" returns successfully" Feb 13 16:03:17.799508 containerd[1568]: time="2025-02-13T16:03:17.799413591Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:03:17.799508 containerd[1568]: time="2025-02-13T16:03:17.799455838Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" successfully" Feb 13 16:03:17.799508 containerd[1568]: time="2025-02-13T16:03:17.799463436Z" level=info msg="StopPodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:03:17.799709 containerd[1568]: time="2025-02-13T16:03:17.799685293Z" level=info msg="RemovePodSandbox for \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:03:17.799745 containerd[1568]: time="2025-02-13T16:03:17.799705589Z" level=info msg="Forcibly stopping sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\"" Feb 13 16:03:17.799787 containerd[1568]: time="2025-02-13T16:03:17.799755564Z" level=info msg="TearDown network for sandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" successfully" Feb 13 16:03:17.801171 containerd[1568]: time="2025-02-13T16:03:17.801152753Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.801219 containerd[1568]: time="2025-02-13T16:03:17.801205324Z" level=info msg="RemovePodSandbox \"85c9610bf95696ff784dc9fa690dfe5dc6c36096b9e48683a8b31a8360e186e8\" returns successfully" Feb 13 16:03:17.801433 containerd[1568]: time="2025-02-13T16:03:17.801417549Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:03:17.801484 containerd[1568]: time="2025-02-13T16:03:17.801471335Z" level=info msg="TearDown network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" successfully" Feb 13 16:03:17.801484 containerd[1568]: time="2025-02-13T16:03:17.801481084Z" level=info msg="StopPodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" returns successfully" Feb 13 16:03:17.801638 containerd[1568]: time="2025-02-13T16:03:17.801623118Z" level=info msg="RemovePodSandbox for \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:03:17.801715 containerd[1568]: time="2025-02-13T16:03:17.801700759Z" level=info msg="Forcibly stopping sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\"" Feb 13 16:03:17.801777 containerd[1568]: time="2025-02-13T16:03:17.801750396Z" level=info msg="TearDown network for sandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" successfully" Feb 13 16:03:17.803216 containerd[1568]: time="2025-02-13T16:03:17.803200341Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.803255 containerd[1568]: time="2025-02-13T16:03:17.803229225Z" level=info msg="RemovePodSandbox \"b188da4bacd06ed245d1092b40deb30fcbee3f87d508bfcea72c523fca8f7a88\" returns successfully" Feb 13 16:03:17.803429 containerd[1568]: time="2025-02-13T16:03:17.803411221Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" Feb 13 16:03:17.803473 containerd[1568]: time="2025-02-13T16:03:17.803460915Z" level=info msg="TearDown network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" successfully" Feb 13 16:03:17.803473 containerd[1568]: time="2025-02-13T16:03:17.803470594Z" level=info msg="StopPodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" returns successfully" Feb 13 16:03:17.803637 containerd[1568]: time="2025-02-13T16:03:17.803622340Z" level=info msg="RemovePodSandbox for \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" Feb 13 16:03:17.803668 containerd[1568]: time="2025-02-13T16:03:17.803637609Z" level=info msg="Forcibly stopping sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\"" Feb 13 16:03:17.803701 containerd[1568]: time="2025-02-13T16:03:17.803691383Z" level=info msg="TearDown network for sandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" successfully" Feb 13 16:03:17.805252 containerd[1568]: time="2025-02-13T16:03:17.805230416Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.805351 containerd[1568]: time="2025-02-13T16:03:17.805265718Z" level=info msg="RemovePodSandbox \"d1a0fd3d4773ea8ed4f1352eea601638fcb937bf3ac77af38dac73e7c48790e4\" returns successfully" Feb 13 16:03:17.805478 containerd[1568]: time="2025-02-13T16:03:17.805443181Z" level=info msg="StopPodSandbox for \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\"" Feb 13 16:03:17.805508 containerd[1568]: time="2025-02-13T16:03:17.805500650Z" level=info msg="TearDown network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" successfully" Feb 13 16:03:17.806217 containerd[1568]: time="2025-02-13T16:03:17.805510358Z" level=info msg="StopPodSandbox for \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" returns successfully" Feb 13 16:03:17.806217 containerd[1568]: time="2025-02-13T16:03:17.805727156Z" level=info msg="RemovePodSandbox for \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\"" Feb 13 16:03:17.806217 containerd[1568]: time="2025-02-13T16:03:17.805740845Z" level=info msg="Forcibly stopping sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\"" Feb 13 16:03:17.806217 containerd[1568]: time="2025-02-13T16:03:17.805791345Z" level=info msg="TearDown network for sandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" successfully" Feb 13 16:03:17.806995 containerd[1568]: time="2025-02-13T16:03:17.806978878Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.807037 containerd[1568]: time="2025-02-13T16:03:17.807006512Z" level=info msg="RemovePodSandbox \"bb6a0297efbba3923148dca766176e0bbe8e6bc09c36eefa0819c19ac70357f5\" returns successfully" Feb 13 16:03:17.807284 containerd[1568]: time="2025-02-13T16:03:17.807195915Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:03:17.807284 containerd[1568]: time="2025-02-13T16:03:17.807240662Z" level=info msg="TearDown network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:03:17.807284 containerd[1568]: time="2025-02-13T16:03:17.807248054Z" level=info msg="StopPodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:03:17.808339 containerd[1568]: time="2025-02-13T16:03:17.807501476Z" level=info msg="RemovePodSandbox for \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:03:17.808339 containerd[1568]: time="2025-02-13T16:03:17.807517111Z" level=info msg="Forcibly stopping sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\"" Feb 13 16:03:17.808339 containerd[1568]: time="2025-02-13T16:03:17.807554130Z" level=info msg="TearDown network for sandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" successfully" Feb 13 16:03:17.808790 containerd[1568]: time="2025-02-13T16:03:17.808778654Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.808863 containerd[1568]: time="2025-02-13T16:03:17.808853238Z" level=info msg="RemovePodSandbox \"a3b1b5761064068069d0397e59d85626a1cbe6f8ed4b498da46eff83c2b6f5d5\" returns successfully" Feb 13 16:03:17.809084 containerd[1568]: time="2025-02-13T16:03:17.809055872Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:03:17.809169 containerd[1568]: time="2025-02-13T16:03:17.809161082Z" level=info msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:03:17.809202 containerd[1568]: time="2025-02-13T16:03:17.809195937Z" level=info msg="StopPodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:03:17.809347 containerd[1568]: time="2025-02-13T16:03:17.809332257Z" level=info msg="RemovePodSandbox for \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:03:17.809434 containerd[1568]: time="2025-02-13T16:03:17.809426149Z" level=info msg="Forcibly stopping sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\"" Feb 13 16:03:17.809517 containerd[1568]: time="2025-02-13T16:03:17.809497172Z" level=info msg="TearDown network for sandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" successfully" Feb 13 16:03:17.816768 containerd[1568]: time="2025-02-13T16:03:17.816749725Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.817006 containerd[1568]: time="2025-02-13T16:03:17.816995447Z" level=info msg="RemovePodSandbox \"611bf3620fc0ef1f86ea935a98107dae18ce5a779b3bb5177b863b63b1ad585c\" returns successfully" Feb 13 16:03:17.817267 containerd[1568]: time="2025-02-13T16:03:17.817255297Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:03:17.817360 containerd[1568]: time="2025-02-13T16:03:17.817351841Z" level=info msg="TearDown network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" successfully" Feb 13 16:03:17.817414 containerd[1568]: time="2025-02-13T16:03:17.817406691Z" level=info msg="StopPodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" returns successfully" Feb 13 16:03:17.817641 containerd[1568]: time="2025-02-13T16:03:17.817619827Z" level=info msg="RemovePodSandbox for \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:03:17.817641 containerd[1568]: time="2025-02-13T16:03:17.817636583Z" level=info msg="Forcibly stopping sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\"" Feb 13 16:03:17.817697 containerd[1568]: time="2025-02-13T16:03:17.817672736Z" level=info msg="TearDown network for sandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" successfully" Feb 13 16:03:17.830006 containerd[1568]: time="2025-02-13T16:03:17.829947729Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.830006 containerd[1568]: time="2025-02-13T16:03:17.829990410Z" level=info msg="RemovePodSandbox \"dc6781a84b948f67e65ce00f445d7ee350825621c08eaa9039af048c4381b5d6\" returns successfully" Feb 13 16:03:17.830682 containerd[1568]: time="2025-02-13T16:03:17.830448706Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" Feb 13 16:03:17.830682 containerd[1568]: time="2025-02-13T16:03:17.830504816Z" level=info msg="TearDown network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" successfully" Feb 13 16:03:17.830682 containerd[1568]: time="2025-02-13T16:03:17.830512214Z" level=info msg="StopPodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" returns successfully" Feb 13 16:03:17.837217 containerd[1568]: time="2025-02-13T16:03:17.830846269Z" level=info msg="RemovePodSandbox for \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" Feb 13 16:03:17.837217 containerd[1568]: time="2025-02-13T16:03:17.830860065Z" level=info msg="Forcibly stopping sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\"" Feb 13 16:03:17.837217 containerd[1568]: time="2025-02-13T16:03:17.830929568Z" level=info msg="TearDown network for sandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" successfully" Feb 13 16:03:17.846658 containerd[1568]: time="2025-02-13T16:03:17.846586897Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.846658 containerd[1568]: time="2025-02-13T16:03:17.846617672Z" level=info msg="RemovePodSandbox \"91ca847040139b09e313885c6e74df355c813c4f963f046f0bd9eeb6590ec6a9\" returns successfully" Feb 13 16:03:17.846866 containerd[1568]: time="2025-02-13T16:03:17.846849834Z" level=info msg="StopPodSandbox for \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\"" Feb 13 16:03:17.846949 containerd[1568]: time="2025-02-13T16:03:17.846933622Z" level=info msg="TearDown network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" successfully" Feb 13 16:03:17.846949 containerd[1568]: time="2025-02-13T16:03:17.846944486Z" level=info msg="StopPodSandbox for \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" returns successfully" Feb 13 16:03:17.847100 containerd[1568]: time="2025-02-13T16:03:17.847084508Z" level=info msg="RemovePodSandbox for \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\"" Feb 13 16:03:17.847100 containerd[1568]: time="2025-02-13T16:03:17.847096495Z" level=info msg="Forcibly stopping sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\"" Feb 13 16:03:17.847144 containerd[1568]: time="2025-02-13T16:03:17.847126373Z" level=info msg="TearDown network for sandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" successfully" Feb 13 16:03:17.860072 containerd[1568]: time="2025-02-13T16:03:17.860045225Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.860130 containerd[1568]: time="2025-02-13T16:03:17.860081385Z" level=info msg="RemovePodSandbox \"c5fc77c20a01fc467a703073586cadfa3cf817f48de5b64aac8f4d58f8a9116a\" returns successfully" Feb 13 16:03:17.860501 containerd[1568]: time="2025-02-13T16:03:17.860320232Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:03:17.860501 containerd[1568]: time="2025-02-13T16:03:17.860425186Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:03:17.860501 containerd[1568]: time="2025-02-13T16:03:17.860435062Z" level=info msg="StopPodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:03:17.860794 containerd[1568]: time="2025-02-13T16:03:17.860768011Z" level=info msg="RemovePodSandbox for \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:03:17.860794 containerd[1568]: time="2025-02-13T16:03:17.860787283Z" level=info msg="Forcibly stopping sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\"" Feb 13 16:03:17.860862 containerd[1568]: time="2025-02-13T16:03:17.860833700Z" level=info msg="TearDown network for sandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" successfully" Feb 13 16:03:17.867000 containerd[1568]: time="2025-02-13T16:03:17.866974133Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.867065 containerd[1568]: time="2025-02-13T16:03:17.867002775Z" level=info msg="RemovePodSandbox \"0848f266b3e981acdba0168d63bb830f694728b07e187ce6a4938fe7d4fbe866\" returns successfully" Feb 13 16:03:17.867193 containerd[1568]: time="2025-02-13T16:03:17.867179721Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:03:17.867251 containerd[1568]: time="2025-02-13T16:03:17.867230540Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:03:17.867251 containerd[1568]: time="2025-02-13T16:03:17.867241203Z" level=info msg="StopPodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:03:17.867969 containerd[1568]: time="2025-02-13T16:03:17.867420946Z" level=info msg="RemovePodSandbox for \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:03:17.867969 containerd[1568]: time="2025-02-13T16:03:17.867437625Z" level=info msg="Forcibly stopping sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\"" Feb 13 16:03:17.867969 containerd[1568]: time="2025-02-13T16:03:17.867481319Z" level=info msg="TearDown network for sandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" successfully" Feb 13 16:03:17.883488 containerd[1568]: time="2025-02-13T16:03:17.883466796Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.883568 containerd[1568]: time="2025-02-13T16:03:17.883497385Z" level=info msg="RemovePodSandbox \"932ef405f90a0d9133e58b215dae0db531d9de745475e5a8d5048e5f4dfb931a\" returns successfully" Feb 13 16:03:17.883823 containerd[1568]: time="2025-02-13T16:03:17.883728470Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:03:17.883823 containerd[1568]: time="2025-02-13T16:03:17.883785303Z" level=info msg="TearDown network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:03:17.883823 containerd[1568]: time="2025-02-13T16:03:17.883794131Z" level=info msg="StopPodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns successfully" Feb 13 16:03:17.884125 containerd[1568]: time="2025-02-13T16:03:17.884017389Z" level=info msg="RemovePodSandbox for \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:03:17.884125 containerd[1568]: time="2025-02-13T16:03:17.884029980Z" level=info msg="Forcibly stopping sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\"" Feb 13 16:03:17.884125 containerd[1568]: time="2025-02-13T16:03:17.884080844Z" level=info msg="TearDown network for sandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" successfully" Feb 13 16:03:17.885934 containerd[1568]: time="2025-02-13T16:03:17.885867583Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.885934 containerd[1568]: time="2025-02-13T16:03:17.885890816Z" level=info msg="RemovePodSandbox \"2af42d9f1c6c8a82de9e3b93680add5d485107656956a514a89613e43e01ec62\" returns successfully" Feb 13 16:03:17.886054 containerd[1568]: time="2025-02-13T16:03:17.886039334Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:03:17.886095 containerd[1568]: time="2025-02-13T16:03:17.886083482Z" level=info msg="TearDown network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" successfully" Feb 13 16:03:17.886095 containerd[1568]: time="2025-02-13T16:03:17.886091316Z" level=info msg="StopPodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" returns successfully" Feb 13 16:03:17.886317 containerd[1568]: time="2025-02-13T16:03:17.886303263Z" level=info msg="RemovePodSandbox for \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:03:17.886317 containerd[1568]: time="2025-02-13T16:03:17.886316865Z" level=info msg="Forcibly stopping sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\"" Feb 13 16:03:17.886377 containerd[1568]: time="2025-02-13T16:03:17.886347127Z" level=info msg="TearDown network for sandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" successfully" Feb 13 16:03:17.887480 containerd[1568]: time="2025-02-13T16:03:17.887466157Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.887511 containerd[1568]: time="2025-02-13T16:03:17.887486872Z" level=info msg="RemovePodSandbox \"9c1614ddcf0a9383601abc91e7def31710332c223d7ef7ed41146a7bebf349d9\" returns successfully" Feb 13 16:03:17.887647 containerd[1568]: time="2025-02-13T16:03:17.887632350Z" level=info msg="StopPodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" Feb 13 16:03:17.887783 containerd[1568]: time="2025-02-13T16:03:17.887676284Z" level=info msg="TearDown network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" successfully" Feb 13 16:03:17.887783 containerd[1568]: time="2025-02-13T16:03:17.887682526Z" level=info msg="StopPodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" returns successfully" Feb 13 16:03:17.887838 containerd[1568]: time="2025-02-13T16:03:17.887807815Z" level=info msg="RemovePodSandbox for \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" Feb 13 16:03:17.887838 containerd[1568]: time="2025-02-13T16:03:17.887819580Z" level=info msg="Forcibly stopping sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\"" Feb 13 16:03:17.887877 containerd[1568]: time="2025-02-13T16:03:17.887848012Z" level=info msg="TearDown network for sandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" successfully" Feb 13 16:03:17.889340 containerd[1568]: time="2025-02-13T16:03:17.889324835Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.889573 containerd[1568]: time="2025-02-13T16:03:17.889347549Z" level=info msg="RemovePodSandbox \"424c764557c06c4a42968995d61d896f7cca99ec9bb902e5e8c4687b21043d4e\" returns successfully" Feb 13 16:03:17.889573 containerd[1568]: time="2025-02-13T16:03:17.889482295Z" level=info msg="StopPodSandbox for \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\"" Feb 13 16:03:17.889573 containerd[1568]: time="2025-02-13T16:03:17.889527577Z" level=info msg="TearDown network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" successfully" Feb 13 16:03:17.889573 containerd[1568]: time="2025-02-13T16:03:17.889533935Z" level=info msg="StopPodSandbox for \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" returns successfully" Feb 13 16:03:17.889717 containerd[1568]: time="2025-02-13T16:03:17.889660239Z" level=info msg="RemovePodSandbox for \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\"" Feb 13 16:03:17.889740 containerd[1568]: time="2025-02-13T16:03:17.889729699Z" level=info msg="Forcibly stopping sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\"" Feb 13 16:03:17.889837 containerd[1568]: time="2025-02-13T16:03:17.889805898Z" level=info msg="TearDown network for sandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" successfully" Feb 13 16:03:17.890996 containerd[1568]: time="2025-02-13T16:03:17.890981484Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.891037 containerd[1568]: time="2025-02-13T16:03:17.891004155Z" level=info msg="RemovePodSandbox \"041e3fb2fb8169eeef5ba1bf8f17d1b4a189b4d57f8e298b825ffacf0393b084\" returns successfully" Feb 13 16:03:17.891165 containerd[1568]: time="2025-02-13T16:03:17.891150497Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:03:17.891219 containerd[1568]: time="2025-02-13T16:03:17.891206185Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:03:17.891219 containerd[1568]: time="2025-02-13T16:03:17.891217255Z" level=info msg="StopPodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:03:17.892280 containerd[1568]: time="2025-02-13T16:03:17.891431579Z" level=info msg="RemovePodSandbox for \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:03:17.892280 containerd[1568]: time="2025-02-13T16:03:17.891445055Z" level=info msg="Forcibly stopping sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\"" Feb 13 16:03:17.892280 containerd[1568]: time="2025-02-13T16:03:17.891477834Z" level=info msg="TearDown network for sandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" successfully" Feb 13 16:03:17.892733 containerd[1568]: time="2025-02-13T16:03:17.892719923Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.892793 containerd[1568]: time="2025-02-13T16:03:17.892784002Z" level=info msg="RemovePodSandbox \"5ef632e0ec4f01258f80e1bed122fea3877d83f352539db312cf57a5a48ba4bd\" returns successfully" Feb 13 16:03:17.893088 containerd[1568]: time="2025-02-13T16:03:17.893073389Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:03:17.893119 containerd[1568]: time="2025-02-13T16:03:17.893115092Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:03:17.893150 containerd[1568]: time="2025-02-13T16:03:17.893121344Z" level=info msg="StopPodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:03:17.893307 containerd[1568]: time="2025-02-13T16:03:17.893290572Z" level=info msg="RemovePodSandbox for \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:03:17.893853 containerd[1568]: time="2025-02-13T16:03:17.893307353Z" level=info msg="Forcibly stopping sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\"" Feb 13 16:03:17.893853 containerd[1568]: time="2025-02-13T16:03:17.893339627Z" level=info msg="TearDown network for sandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" successfully" Feb 13 16:03:17.895431 containerd[1568]: time="2025-02-13T16:03:17.895415557Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.895474 containerd[1568]: time="2025-02-13T16:03:17.895442403Z" level=info msg="RemovePodSandbox \"a3c50b69d231448bf1d3069660ecfa76351e37a68139abe736bfd40c1f5590f7\" returns successfully" Feb 13 16:03:17.895666 containerd[1568]: time="2025-02-13T16:03:17.895608184Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:03:17.895785 containerd[1568]: time="2025-02-13T16:03:17.895748567Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:03:17.895785 containerd[1568]: time="2025-02-13T16:03:17.895758142Z" level=info msg="StopPodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:03:17.896468 containerd[1568]: time="2025-02-13T16:03:17.896037884Z" level=info msg="RemovePodSandbox for \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:03:17.896468 containerd[1568]: time="2025-02-13T16:03:17.896053267Z" level=info msg="Forcibly stopping sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\"" Feb 13 16:03:17.896468 containerd[1568]: time="2025-02-13T16:03:17.896094930Z" level=info msg="TearDown network for sandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" successfully" Feb 13 16:03:17.897841 containerd[1568]: time="2025-02-13T16:03:17.897751994Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.897841 containerd[1568]: time="2025-02-13T16:03:17.897778590Z" level=info msg="RemovePodSandbox \"348c354064ceb14ed79e10662fa7efd85a84bd4fb49edc61953adab0e6f1b6f4\" returns successfully" Feb 13 16:03:17.898326 containerd[1568]: time="2025-02-13T16:03:17.898311747Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:03:17.898466 containerd[1568]: time="2025-02-13T16:03:17.898360296Z" level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 13 16:03:17.898466 containerd[1568]: time="2025-02-13T16:03:17.898382288Z" level=info msg="StopPodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:03:17.898855 containerd[1568]: time="2025-02-13T16:03:17.898841126Z" level=info msg="RemovePodSandbox for \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:03:17.898885 containerd[1568]: time="2025-02-13T16:03:17.898856509Z" level=info msg="Forcibly stopping sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\"" Feb 13 16:03:17.899085 containerd[1568]: time="2025-02-13T16:03:17.898890092Z" level=info msg="TearDown network for sandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" successfully" Feb 13 16:03:17.900988 containerd[1568]: time="2025-02-13T16:03:17.900964010Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.901060 containerd[1568]: time="2025-02-13T16:03:17.901013856Z" level=info msg="RemovePodSandbox \"6ebc31d49011f98dd486577b2b4de4cb0dac7b7b3cd2c5c2f154b0c1cb88eccf\" returns successfully" Feb 13 16:03:17.901328 containerd[1568]: time="2025-02-13T16:03:17.901311831Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:03:17.901403 containerd[1568]: time="2025-02-13T16:03:17.901386065Z" level=info msg="TearDown network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" successfully" Feb 13 16:03:17.901403 containerd[1568]: time="2025-02-13T16:03:17.901399026Z" level=info msg="StopPodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" returns successfully" Feb 13 16:03:17.901578 containerd[1568]: time="2025-02-13T16:03:17.901560017Z" level=info msg="RemovePodSandbox for \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:03:17.901602 containerd[1568]: time="2025-02-13T16:03:17.901578534Z" level=info msg="Forcibly stopping sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\"" Feb 13 16:03:17.901658 containerd[1568]: time="2025-02-13T16:03:17.901629658Z" level=info msg="TearDown network for sandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" successfully" Feb 13 16:03:17.903168 containerd[1568]: time="2025-02-13T16:03:17.903147269Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.903218 containerd[1568]: time="2025-02-13T16:03:17.903179251Z" level=info msg="RemovePodSandbox \"db30f17be5af55fc3e0d17686c95d830a5be87f9afde62516957a132e607e5ff\" returns successfully" Feb 13 16:03:17.903505 containerd[1568]: time="2025-02-13T16:03:17.903397206Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" Feb 13 16:03:17.903505 containerd[1568]: time="2025-02-13T16:03:17.903461272Z" level=info msg="TearDown network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" successfully" Feb 13 16:03:17.903505 containerd[1568]: time="2025-02-13T16:03:17.903470547Z" level=info msg="StopPodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" returns successfully" Feb 13 16:03:17.903627 containerd[1568]: time="2025-02-13T16:03:17.903606170Z" level=info msg="RemovePodSandbox for \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" Feb 13 16:03:17.903627 containerd[1568]: time="2025-02-13T16:03:17.903620887Z" level=info msg="Forcibly stopping sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\"" Feb 13 16:03:17.903683 containerd[1568]: time="2025-02-13T16:03:17.903660801Z" level=info msg="TearDown network for sandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" successfully" Feb 13 16:03:17.904979 containerd[1568]: time="2025-02-13T16:03:17.904960210Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.905075 containerd[1568]: time="2025-02-13T16:03:17.904993643Z" level=info msg="RemovePodSandbox \"1bbf58cd374fc544459e7359e6b165f7dc1ff4a1908b7cb1681674f4c2c2d910\" returns successfully" Feb 13 16:03:17.905407 containerd[1568]: time="2025-02-13T16:03:17.905248022Z" level=info msg="StopPodSandbox for \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\"" Feb 13 16:03:17.905407 containerd[1568]: time="2025-02-13T16:03:17.905297223Z" level=info msg="TearDown network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" successfully" Feb 13 16:03:17.905407 containerd[1568]: time="2025-02-13T16:03:17.905304678Z" level=info msg="StopPodSandbox for \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" returns successfully" Feb 13 16:03:17.905758 containerd[1568]: time="2025-02-13T16:03:17.905655004Z" level=info msg="RemovePodSandbox for \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\"" Feb 13 16:03:17.905758 containerd[1568]: time="2025-02-13T16:03:17.905668447Z" level=info msg="Forcibly stopping sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\"" Feb 13 16:03:17.905758 containerd[1568]: time="2025-02-13T16:03:17.905714485Z" level=info msg="TearDown network for sandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" successfully" Feb 13 16:03:17.907534 containerd[1568]: time="2025-02-13T16:03:17.907421540Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:03:17.907534 containerd[1568]: time="2025-02-13T16:03:17.907455659Z" level=info msg="RemovePodSandbox \"ac2a4073417629432399591da40c9e4fe45e9b5ba6e39289fd67e6be2d3b100e\" returns successfully" Feb 13 16:03:27.092067 kubelet[2825]: I0213 16:03:27.091973 2825 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:03:31.219118 systemd[1]: Started sshd@7-139.178.70.107:22-147.75.109.163:54998.service - OpenSSH per-connection server daemon (147.75.109.163:54998). Feb 13 16:03:31.313237 sshd[5857]: Accepted publickey for core from 147.75.109.163 port 54998 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:31.315740 sshd-session[5857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:31.321951 systemd-logind[1551]: New session 10 of user core. Feb 13 16:03:31.330011 systemd[1]: Started session-10.scope - Session 10 of User core. Feb 13 16:03:31.775825 sshd[5859]: Connection closed by 147.75.109.163 port 54998 Feb 13 16:03:31.776194 sshd-session[5857]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:31.781854 systemd[1]: sshd@7-139.178.70.107:22-147.75.109.163:54998.service: Deactivated successfully. Feb 13 16:03:31.783104 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 16:03:31.783551 systemd-logind[1551]: Session 10 logged out. Waiting for processes to exit. Feb 13 16:03:31.784105 systemd-logind[1551]: Removed session 10. Feb 13 16:03:36.787535 systemd[1]: Started sshd@8-139.178.70.107:22-147.75.109.163:55010.service - OpenSSH per-connection server daemon (147.75.109.163:55010). Feb 13 16:03:36.838503 sshd[5872]: Accepted publickey for core from 147.75.109.163 port 55010 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:36.839327 sshd-session[5872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:36.842346 systemd-logind[1551]: New session 11 of user core. 
Feb 13 16:03:36.849056 systemd[1]: Started session-11.scope - Session 11 of User core. Feb 13 16:03:37.185544 sshd[5874]: Connection closed by 147.75.109.163 port 55010 Feb 13 16:03:37.186023 sshd-session[5872]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:37.188022 systemd-logind[1551]: Session 11 logged out. Waiting for processes to exit. Feb 13 16:03:37.189369 systemd[1]: sshd@8-139.178.70.107:22-147.75.109.163:55010.service: Deactivated successfully. Feb 13 16:03:37.190401 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 16:03:37.192088 systemd-logind[1551]: Removed session 11. Feb 13 16:03:42.199325 systemd[1]: Started sshd@9-139.178.70.107:22-147.75.109.163:45138.service - OpenSSH per-connection server daemon (147.75.109.163:45138). Feb 13 16:03:42.256765 sshd[5908]: Accepted publickey for core from 147.75.109.163 port 45138 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:42.258063 sshd-session[5908]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:42.263939 systemd-logind[1551]: New session 12 of user core. Feb 13 16:03:42.268026 systemd[1]: Started session-12.scope - Session 12 of User core. Feb 13 16:03:42.378929 sshd[5919]: Connection closed by 147.75.109.163 port 45138 Feb 13 16:03:42.379428 sshd-session[5908]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:42.385416 systemd[1]: sshd@9-139.178.70.107:22-147.75.109.163:45138.service: Deactivated successfully. Feb 13 16:03:42.386329 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 16:03:42.387144 systemd-logind[1551]: Session 12 logged out. Waiting for processes to exit. Feb 13 16:03:42.392550 systemd[1]: Started sshd@10-139.178.70.107:22-147.75.109.163:45154.service - OpenSSH per-connection server daemon (147.75.109.163:45154). Feb 13 16:03:42.393538 systemd-logind[1551]: Removed session 12. 
Feb 13 16:03:42.424112 sshd[5941]: Accepted publickey for core from 147.75.109.163 port 45154 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:42.425002 sshd-session[5941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:42.429948 systemd-logind[1551]: New session 13 of user core. Feb 13 16:03:42.433080 systemd[1]: Started session-13.scope - Session 13 of User core. Feb 13 16:03:42.575732 sshd[5944]: Connection closed by 147.75.109.163 port 45154 Feb 13 16:03:42.577969 sshd-session[5941]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:42.591989 systemd[1]: Started sshd@11-139.178.70.107:22-147.75.109.163:45170.service - OpenSSH per-connection server daemon (147.75.109.163:45170). Feb 13 16:03:42.592443 systemd[1]: sshd@10-139.178.70.107:22-147.75.109.163:45154.service: Deactivated successfully. Feb 13 16:03:42.593669 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 16:03:42.598494 systemd-logind[1551]: Session 13 logged out. Waiting for processes to exit. Feb 13 16:03:42.602495 systemd-logind[1551]: Removed session 13. Feb 13 16:03:42.635440 sshd[5951]: Accepted publickey for core from 147.75.109.163 port 45170 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:42.636871 sshd-session[5951]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:42.640451 systemd-logind[1551]: New session 14 of user core. Feb 13 16:03:42.651068 systemd[1]: Started session-14.scope - Session 14 of User core. Feb 13 16:03:42.790837 sshd[5956]: Connection closed by 147.75.109.163 port 45170 Feb 13 16:03:42.791230 sshd-session[5951]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:42.793527 systemd[1]: sshd@11-139.178.70.107:22-147.75.109.163:45170.service: Deactivated successfully. Feb 13 16:03:42.794603 systemd[1]: session-14.scope: Deactivated successfully. 
Feb 13 16:03:42.795065 systemd-logind[1551]: Session 14 logged out. Waiting for processes to exit. Feb 13 16:03:42.795679 systemd-logind[1551]: Removed session 14. Feb 13 16:03:47.801026 systemd[1]: Started sshd@12-139.178.70.107:22-147.75.109.163:45182.service - OpenSSH per-connection server daemon (147.75.109.163:45182). Feb 13 16:03:47.858351 sshd[5993]: Accepted publickey for core from 147.75.109.163 port 45182 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:47.859093 sshd-session[5993]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:47.862299 systemd-logind[1551]: New session 15 of user core. Feb 13 16:03:47.871989 systemd[1]: Started session-15.scope - Session 15 of User core. Feb 13 16:03:47.963076 sshd[5996]: Connection closed by 147.75.109.163 port 45182 Feb 13 16:03:47.963024 sshd-session[5993]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:47.966802 systemd[1]: sshd@12-139.178.70.107:22-147.75.109.163:45182.service: Deactivated successfully. Feb 13 16:03:47.968257 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 16:03:47.969697 systemd-logind[1551]: Session 15 logged out. Waiting for processes to exit. Feb 13 16:03:47.970942 systemd-logind[1551]: Removed session 15. Feb 13 16:03:52.979289 systemd[1]: Started sshd@13-139.178.70.107:22-147.75.109.163:42944.service - OpenSSH per-connection server daemon (147.75.109.163:42944). Feb 13 16:03:53.016474 sshd[6007]: Accepted publickey for core from 147.75.109.163 port 42944 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:53.017602 sshd-session[6007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:53.020729 systemd-logind[1551]: New session 16 of user core. Feb 13 16:03:53.028024 systemd[1]: Started session-16.scope - Session 16 of User core. 
Feb 13 16:03:53.142039 sshd[6009]: Connection closed by 147.75.109.163 port 42944 Feb 13 16:03:53.143039 sshd-session[6007]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:53.149654 systemd[1]: sshd@13-139.178.70.107:22-147.75.109.163:42944.service: Deactivated successfully. Feb 13 16:03:53.150717 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 16:03:53.151301 systemd-logind[1551]: Session 16 logged out. Waiting for processes to exit. Feb 13 16:03:53.155071 systemd[1]: Started sshd@14-139.178.70.107:22-147.75.109.163:42950.service - OpenSSH per-connection server daemon (147.75.109.163:42950). Feb 13 16:03:53.156170 systemd-logind[1551]: Removed session 16. Feb 13 16:03:53.188291 sshd[6020]: Accepted publickey for core from 147.75.109.163 port 42950 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:53.189213 sshd-session[6020]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:53.192946 systemd-logind[1551]: New session 17 of user core. Feb 13 16:03:53.203102 systemd[1]: Started session-17.scope - Session 17 of User core. Feb 13 16:03:53.555934 sshd[6023]: Connection closed by 147.75.109.163 port 42950 Feb 13 16:03:53.565386 systemd[1]: Started sshd@15-139.178.70.107:22-147.75.109.163:42962.service - OpenSSH per-connection server daemon (147.75.109.163:42962). Feb 13 16:03:53.565886 sshd-session[6020]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:53.568353 systemd[1]: sshd@14-139.178.70.107:22-147.75.109.163:42950.service: Deactivated successfully. Feb 13 16:03:53.568513 systemd-logind[1551]: Session 17 logged out. Waiting for processes to exit. Feb 13 16:03:53.569665 systemd[1]: session-17.scope: Deactivated successfully. Feb 13 16:03:53.570630 systemd-logind[1551]: Removed session 17. 
Feb 13 16:03:53.722723 sshd[6032]: Accepted publickey for core from 147.75.109.163 port 42962 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:53.733446 sshd-session[6032]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:53.737338 systemd-logind[1551]: New session 18 of user core. Feb 13 16:03:53.742005 systemd[1]: Started session-18.scope - Session 18 of User core. Feb 13 16:03:55.303582 sshd[6037]: Connection closed by 147.75.109.163 port 42962 Feb 13 16:03:55.314408 systemd[1]: Started sshd@16-139.178.70.107:22-147.75.109.163:42978.service - OpenSSH per-connection server daemon (147.75.109.163:42978). Feb 13 16:03:55.318552 sshd-session[6032]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:55.330372 systemd-logind[1551]: Session 18 logged out. Waiting for processes to exit. Feb 13 16:03:55.331058 systemd[1]: sshd@15-139.178.70.107:22-147.75.109.163:42962.service: Deactivated successfully. Feb 13 16:03:55.332574 systemd[1]: session-18.scope: Deactivated successfully. Feb 13 16:03:55.332728 systemd[1]: session-18.scope: Consumed 349ms CPU time, 70.1M memory peak. Feb 13 16:03:55.333421 systemd-logind[1551]: Removed session 18. Feb 13 16:03:55.422456 sshd[6051]: Accepted publickey for core from 147.75.109.163 port 42978 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:55.423234 sshd-session[6051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:55.426428 systemd-logind[1551]: New session 19 of user core. Feb 13 16:03:55.431040 systemd[1]: Started session-19.scope - Session 19 of User core. Feb 13 16:03:56.059748 sshd[6056]: Connection closed by 147.75.109.163 port 42978 Feb 13 16:03:56.060311 sshd-session[6051]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:56.068841 systemd[1]: sshd@16-139.178.70.107:22-147.75.109.163:42978.service: Deactivated successfully. 
Feb 13 16:03:56.070680 systemd[1]: session-19.scope: Deactivated successfully. Feb 13 16:03:56.076094 systemd-logind[1551]: Session 19 logged out. Waiting for processes to exit. Feb 13 16:03:56.080169 systemd[1]: Started sshd@17-139.178.70.107:22-147.75.109.163:42982.service - OpenSSH per-connection server daemon (147.75.109.163:42982). Feb 13 16:03:56.082218 systemd-logind[1551]: Removed session 19. Feb 13 16:03:56.131431 sshd[6066]: Accepted publickey for core from 147.75.109.163 port 42982 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:56.132613 sshd-session[6066]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:56.135643 systemd-logind[1551]: New session 20 of user core. Feb 13 16:03:56.138980 systemd[1]: Started session-20.scope - Session 20 of User core. Feb 13 16:03:56.240567 sshd[6069]: Connection closed by 147.75.109.163 port 42982 Feb 13 16:03:56.240981 sshd-session[6066]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:56.242969 systemd[1]: sshd@17-139.178.70.107:22-147.75.109.163:42982.service: Deactivated successfully. Feb 13 16:03:56.243949 systemd[1]: session-20.scope: Deactivated successfully. Feb 13 16:03:56.244451 systemd-logind[1551]: Session 20 logged out. Waiting for processes to exit. Feb 13 16:03:56.245031 systemd-logind[1551]: Removed session 20. Feb 13 16:04:01.252410 systemd[1]: Started sshd@18-139.178.70.107:22-147.75.109.163:47248.service - OpenSSH per-connection server daemon (147.75.109.163:47248). Feb 13 16:04:01.289226 sshd[6084]: Accepted publickey for core from 147.75.109.163 port 47248 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:04:01.290372 sshd-session[6084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:04:01.293305 systemd-logind[1551]: New session 21 of user core. Feb 13 16:04:01.303011 systemd[1]: Started session-21.scope - Session 21 of User core. 
Feb 13 16:04:01.396120 sshd[6086]: Connection closed by 147.75.109.163 port 47248 Feb 13 16:04:01.395811 sshd-session[6084]: pam_unix(sshd:session): session closed for user core Feb 13 16:04:01.397572 systemd[1]: sshd@18-139.178.70.107:22-147.75.109.163:47248.service: Deactivated successfully. Feb 13 16:04:01.398858 systemd[1]: session-21.scope: Deactivated successfully. Feb 13 16:04:01.399941 systemd-logind[1551]: Session 21 logged out. Waiting for processes to exit. Feb 13 16:04:01.400477 systemd-logind[1551]: Removed session 21. Feb 13 16:04:06.406806 systemd[1]: Started sshd@19-139.178.70.107:22-147.75.109.163:47260.service - OpenSSH per-connection server daemon (147.75.109.163:47260). Feb 13 16:04:06.613968 sshd[6098]: Accepted publickey for core from 147.75.109.163 port 47260 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:04:06.615259 sshd-session[6098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:04:06.618448 systemd-logind[1551]: New session 22 of user core. Feb 13 16:04:06.621988 systemd[1]: Started session-22.scope - Session 22 of User core. Feb 13 16:04:06.749766 sshd[6100]: Connection closed by 147.75.109.163 port 47260 Feb 13 16:04:06.750348 sshd-session[6098]: pam_unix(sshd:session): session closed for user core Feb 13 16:04:06.752296 systemd[1]: sshd@19-139.178.70.107:22-147.75.109.163:47260.service: Deactivated successfully. Feb 13 16:04:06.753486 systemd[1]: session-22.scope: Deactivated successfully. Feb 13 16:04:06.754615 systemd-logind[1551]: Session 22 logged out. Waiting for processes to exit. Feb 13 16:04:06.755155 systemd-logind[1551]: Removed session 22. Feb 13 16:04:11.764094 systemd[1]: Started sshd@20-139.178.70.107:22-147.75.109.163:56296.service - OpenSSH per-connection server daemon (147.75.109.163:56296). 
Feb 13 16:04:12.060132 sshd[6143]: Accepted publickey for core from 147.75.109.163 port 56296 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:04:12.066326 sshd-session[6143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:04:12.072944 systemd-logind[1551]: New session 23 of user core. Feb 13 16:04:12.084006 systemd[1]: Started session-23.scope - Session 23 of User core. Feb 13 16:04:12.333616 sshd[6155]: Connection closed by 147.75.109.163 port 56296 Feb 13 16:04:12.334066 sshd-session[6143]: pam_unix(sshd:session): session closed for user core Feb 13 16:04:12.335730 systemd-logind[1551]: Session 23 logged out. Waiting for processes to exit. Feb 13 16:04:12.336216 systemd[1]: sshd@20-139.178.70.107:22-147.75.109.163:56296.service: Deactivated successfully. Feb 13 16:04:12.337596 systemd[1]: session-23.scope: Deactivated successfully. Feb 13 16:04:12.338884 systemd-logind[1551]: Removed session 23.