Feb 13 16:01:56.751958 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 14:00:20 -00 2025 Feb 13 16:01:56.751975 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65 Feb 13 16:01:56.751982 kernel: Disabled fast string operations Feb 13 16:01:56.751987 kernel: BIOS-provided physical RAM map: Feb 13 16:01:56.751991 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable Feb 13 16:01:56.751996 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved Feb 13 16:01:56.752003 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved Feb 13 16:01:56.752007 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable Feb 13 16:01:56.752012 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data Feb 13 16:01:56.752017 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS Feb 13 16:01:56.752021 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable Feb 13 16:01:56.752026 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved Feb 13 16:01:56.752031 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved Feb 13 16:01:56.752035 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved Feb 13 16:01:56.752042 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved Feb 13 16:01:56.752048 kernel: NX (Execute Disable) protection: active Feb 13 16:01:56.752053 kernel: APIC: Static calls initialized Feb 13 16:01:56.752058 kernel: SMBIOS 2.7 present. Feb 13 16:01:56.752064 kernel: DMI: VMware, Inc. 
VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020 Feb 13 16:01:56.752069 kernel: vmware: hypercall mode: 0x00 Feb 13 16:01:56.752074 kernel: Hypervisor detected: VMware Feb 13 16:01:56.752080 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz Feb 13 16:01:56.752086 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz Feb 13 16:01:56.752091 kernel: vmware: using clock offset of 2381206691 ns Feb 13 16:01:56.752096 kernel: tsc: Detected 3408.000 MHz processor Feb 13 16:01:56.752102 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Feb 13 16:01:56.752108 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Feb 13 16:01:56.752113 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000 Feb 13 16:01:56.752118 kernel: total RAM covered: 3072M Feb 13 16:01:56.752124 kernel: Found optimal setting for mtrr clean up Feb 13 16:01:56.752131 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G Feb 13 16:01:56.752137 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs Feb 13 16:01:56.752144 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Feb 13 16:01:56.752150 kernel: Using GB pages for direct mapping Feb 13 16:01:56.752155 kernel: ACPI: Early table checksum verification disabled Feb 13 16:01:56.752160 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD ) Feb 13 16:01:56.752165 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272) Feb 13 16:01:56.752171 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240) Feb 13 16:01:56.752176 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001) Feb 13 16:01:56.752182 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Feb 13 16:01:56.752190 kernel: ACPI: FACS 0x000000007FEFFFC0 000040 Feb 13 16:01:56.752196 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Feb 13 16:01:56.752202 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? 
APIC 06040000 LTP 00000000) Feb 13 16:01:56.752207 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Feb 13 16:01:56.752213 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Feb 13 16:01:56.752219 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Feb 13 16:01:56.752225 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Feb 13 16:01:56.752231 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Feb 13 16:01:56.752237 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Feb 13 16:01:56.752242 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Feb 13 16:01:56.752248 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Feb 13 16:01:56.752254 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Feb 13 16:01:56.752260 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Feb 13 16:01:56.752265 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Feb 13 16:01:56.752271 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Feb 13 16:01:56.752278 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Feb 13 16:01:56.752283 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Feb 13 16:01:56.752289 kernel: system APIC only can use physical flat Feb 13 16:01:56.752294 kernel: APIC: Switched APIC routing to: physical flat Feb 13 16:01:56.752300 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Feb 13 16:01:56.752305 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Feb 13 16:01:56.752311 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Feb 13 16:01:56.752316 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Feb 13 16:01:56.752322 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Feb 13 16:01:56.752327 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Feb 13 16:01:56.752334 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Feb 13 16:01:56.752339 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Feb 13 16:01:56.752345 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 Feb 13 16:01:56.752350 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 Feb 13 16:01:56.752369 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 Feb 13 16:01:56.752374 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 Feb 13 16:01:56.752379 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 Feb 13 16:01:56.752384 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 Feb 13 16:01:56.752389 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 Feb 13 16:01:56.752394 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 Feb 13 16:01:56.752400 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 Feb 13 16:01:56.752405 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 Feb 13 16:01:56.752410 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 Feb 13 16:01:56.752415 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 Feb 13 16:01:56.752420 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 Feb 13 16:01:56.752425 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 Feb 13 16:01:56.752430 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 Feb 13 16:01:56.752435 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 Feb 13 16:01:56.752583 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 Feb 13 16:01:56.752591 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 Feb 13 16:01:56.752599 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 Feb 13 16:01:56.752604 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 Feb 13 16:01:56.752609 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 Feb 13 16:01:56.752614 kernel: SRAT: PXM 0 -> APIC 0x3a -> 
Node 0 Feb 13 16:01:56.752619 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 Feb 13 16:01:56.752624 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 Feb 13 16:01:56.752629 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 Feb 13 16:01:56.752634 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 Feb 13 16:01:56.752639 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 Feb 13 16:01:56.752644 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 Feb 13 16:01:56.752650 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0 Feb 13 16:01:56.752655 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 Feb 13 16:01:56.752660 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 Feb 13 16:01:56.752665 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 Feb 13 16:01:56.752670 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 Feb 13 16:01:56.752675 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 Feb 13 16:01:56.752680 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 Feb 13 16:01:56.752685 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 Feb 13 16:01:56.752690 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 Feb 13 16:01:56.752695 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 Feb 13 16:01:56.752701 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 Feb 13 16:01:56.752706 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 Feb 13 16:01:56.752711 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 Feb 13 16:01:56.752716 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 Feb 13 16:01:56.752721 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 Feb 13 16:01:56.752726 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 Feb 13 16:01:56.752731 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 Feb 13 16:01:56.752736 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 Feb 13 16:01:56.752741 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 Feb 13 16:01:56.752746 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 Feb 13 16:01:56.752752 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 Feb 13 16:01:56.752757 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 Feb 13 16:01:56.752762 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 Feb 13 16:01:56.752771 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 Feb 13 16:01:56.752777 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 Feb 13 16:01:56.752782 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 Feb 13 16:01:56.752787 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 Feb 13 16:01:56.752793 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0 Feb 13 16:01:56.752798 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0 Feb 13 16:01:56.752804 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 Feb 13 16:01:56.752810 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0 Feb 13 16:01:56.752815 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 Feb 13 16:01:56.752820 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 Feb 13 16:01:56.752826 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 Feb 13 16:01:56.752831 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 Feb 13 16:01:56.752836 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 Feb 13 16:01:56.752842 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 Feb 13 16:01:56.752847 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 Feb 13 16:01:56.752852 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 Feb 13 16:01:56.752858 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 Feb 13 16:01:56.752864 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 Feb 13 16:01:56.752869 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 Feb 13 16:01:56.752875 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 Feb 13 16:01:56.752880 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 Feb 13 16:01:56.752885 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 Feb 13 16:01:56.752891 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 Feb 13 16:01:56.752896 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 Feb 13 16:01:56.752901 kernel: SRAT: PXM 0 -> 
APIC 0xa6 -> Node 0 Feb 13 16:01:56.752906 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 Feb 13 16:01:56.752913 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 Feb 13 16:01:56.752918 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 Feb 13 16:01:56.752923 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 Feb 13 16:01:56.752929 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 Feb 13 16:01:56.752934 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 Feb 13 16:01:56.752939 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 Feb 13 16:01:56.752945 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 Feb 13 16:01:56.752950 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 Feb 13 16:01:56.752955 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0 Feb 13 16:01:56.752960 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0 Feb 13 16:01:56.752967 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 Feb 13 16:01:56.752972 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 Feb 13 16:01:56.752977 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 Feb 13 16:01:56.752983 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 Feb 13 16:01:56.752988 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 Feb 13 16:01:56.752993 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 Feb 13 16:01:56.752998 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 Feb 13 16:01:56.753004 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 Feb 13 16:01:56.753009 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 Feb 13 16:01:56.753014 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 Feb 13 16:01:56.753020 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 Feb 13 16:01:56.753026 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 Feb 13 16:01:56.753032 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 Feb 13 16:01:56.753037 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 Feb 13 16:01:56.753042 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 Feb 13 16:01:56.753047 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 Feb 13 16:01:56.753053 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 Feb 13 16:01:56.753058 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 Feb 13 16:01:56.753063 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 Feb 13 16:01:56.753069 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 Feb 13 16:01:56.753074 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 Feb 13 16:01:56.753080 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Feb 13 16:01:56.753086 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Feb 13 16:01:56.753091 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Feb 13 16:01:56.753096 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Feb 13 16:01:56.753102 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Feb 13 16:01:56.753107 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Feb 13 16:01:56.753112 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Feb 13 16:01:56.753117 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Feb 13 16:01:56.753123 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Feb 13 16:01:56.753128 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Feb 13 16:01:56.753134 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Feb 13 16:01:56.753140 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Feb 13 16:01:56.753145 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Feb 13 16:01:56.753151 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Feb 13 16:01:56.753156 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Feb 13 16:01:56.753162 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Feb 13 16:01:56.753167 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Feb 13 16:01:56.753173 kernel: Zone ranges: Feb 13 16:01:56.753179 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Feb 13 16:01:56.753185 kernel: 
DMA32 [mem 0x0000000001000000-0x000000007fffffff] Feb 13 16:01:56.753190 kernel: Normal empty Feb 13 16:01:56.753196 kernel: Movable zone start for each node Feb 13 16:01:56.753201 kernel: Early memory node ranges Feb 13 16:01:56.753206 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Feb 13 16:01:56.753212 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Feb 13 16:01:56.753217 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Feb 13 16:01:56.753223 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Feb 13 16:01:56.753228 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Feb 13 16:01:56.753234 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Feb 13 16:01:56.753240 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Feb 13 16:01:56.753246 kernel: ACPI: PM-Timer IO Port: 0x1008 Feb 13 16:01:56.753505 kernel: system APIC only can use physical flat Feb 13 16:01:56.753511 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Feb 13 16:01:56.753517 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Feb 13 16:01:56.753522 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Feb 13 16:01:56.753528 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Feb 13 16:01:56.753533 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Feb 13 16:01:56.753538 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Feb 13 16:01:56.753544 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Feb 13 16:01:56.753552 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Feb 13 16:01:56.753557 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Feb 13 16:01:56.753562 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Feb 13 16:01:56.753568 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Feb 13 16:01:56.753573 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Feb 13 16:01:56.753579 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Feb 13 16:01:56.753584 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Feb 13 16:01:56.753589 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Feb 13 16:01:56.753595 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Feb 13 16:01:56.753601 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Feb 13 16:01:56.753607 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Feb 13 16:01:56.753612 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Feb 13 16:01:56.753617 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Feb 13 16:01:56.753623 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Feb 13 16:01:56.753628 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Feb 13 16:01:56.753634 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Feb 13 16:01:56.753639 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Feb 13 16:01:56.753644 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Feb 13 16:01:56.753650 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Feb 13 16:01:56.753656 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Feb 13 16:01:56.753662 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Feb 13 16:01:56.753667 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Feb 13 16:01:56.753672 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Feb 13 16:01:56.753677 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Feb 13 16:01:56.753683 kernel: ACPI: 
LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Feb 13 16:01:56.753688 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Feb 13 16:01:56.753694 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Feb 13 16:01:56.753699 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Feb 13 16:01:56.753705 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Feb 13 16:01:56.753711 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Feb 13 16:01:56.753716 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Feb 13 16:01:56.753722 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Feb 13 16:01:56.753727 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Feb 13 16:01:56.753732 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Feb 13 16:01:56.753738 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Feb 13 16:01:56.753743 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Feb 13 16:01:56.753748 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Feb 13 16:01:56.753754 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Feb 13 16:01:56.753760 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Feb 13 16:01:56.753766 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Feb 13 16:01:56.753771 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Feb 13 16:01:56.753776 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Feb 13 16:01:56.753782 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Feb 13 16:01:56.753787 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Feb 13 16:01:56.753792 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Feb 13 16:01:56.753798 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Feb 13 16:01:56.753803 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Feb 13 16:01:56.753808 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Feb 13 16:01:56.753815 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Feb 13 16:01:56.753820 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Feb 13 16:01:56.753825 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Feb 13 16:01:56.753831 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Feb 13 16:01:56.753836 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Feb 13 16:01:56.753841 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Feb 13 16:01:56.753847 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Feb 13 16:01:56.753852 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Feb 13 16:01:56.753858 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Feb 13 16:01:56.753863 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Feb 13 16:01:56.753869 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Feb 13 16:01:56.753875 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Feb 13 16:01:56.753880 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Feb 13 16:01:56.753885 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Feb 13 16:01:56.753891 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Feb 13 16:01:56.753896 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Feb 13 16:01:56.753901 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Feb 13 16:01:56.753907 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Feb 13 16:01:56.753912 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Feb 13 16:01:56.753919 
kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Feb 13 16:01:56.753924 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Feb 13 16:01:56.753929 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Feb 13 16:01:56.753935 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Feb 13 16:01:56.753940 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Feb 13 16:01:56.753946 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Feb 13 16:01:56.753951 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Feb 13 16:01:56.753956 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Feb 13 16:01:56.753962 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Feb 13 16:01:56.753967 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Feb 13 16:01:56.753973 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Feb 13 16:01:56.753979 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Feb 13 16:01:56.753984 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Feb 13 16:01:56.753990 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Feb 13 16:01:56.753995 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Feb 13 16:01:56.754000 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Feb 13 16:01:56.754006 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Feb 13 16:01:56.754011 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Feb 13 16:01:56.754016 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Feb 13 16:01:56.754022 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Feb 13 16:01:56.754028 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Feb 13 16:01:56.754034 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Feb 13 16:01:56.754039 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Feb 13 16:01:56.754044 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Feb 13 16:01:56.754050 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Feb 13 16:01:56.754055 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Feb 13 16:01:56.754060 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Feb 13 16:01:56.754066 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Feb 13 16:01:56.754071 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Feb 13 16:01:56.754077 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Feb 13 16:01:56.754083 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Feb 13 16:01:56.754088 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Feb 13 16:01:56.754093 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Feb 13 16:01:56.754099 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Feb 13 16:01:56.754104 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Feb 13 16:01:56.754109 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Feb 13 16:01:56.754125 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Feb 13 16:01:56.754132 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Feb 13 16:01:56.754137 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Feb 13 16:01:56.754152 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Feb 13 16:01:56.754159 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Feb 13 16:01:56.754164 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Feb 13 16:01:56.754170 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Feb 13 
16:01:56.754175 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Feb 13 16:01:56.754180 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Feb 13 16:01:56.754188 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Feb 13 16:01:56.754194 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Feb 13 16:01:56.754199 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Feb 13 16:01:56.754205 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Feb 13 16:01:56.754211 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Feb 13 16:01:56.754217 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Feb 13 16:01:56.754222 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Feb 13 16:01:56.754227 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Feb 13 16:01:56.754233 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Feb 13 16:01:56.754238 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Feb 13 16:01:56.754244 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Feb 13 16:01:56.754249 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Feb 13 16:01:56.754255 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Feb 13 16:01:56.754261 kernel: TSC deadline timer available Feb 13 16:01:56.754267 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Feb 13 16:01:56.754273 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Feb 13 16:01:56.754278 kernel: Booting paravirtualized kernel on VMware hypervisor Feb 13 16:01:56.754284 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Feb 13 16:01:56.754289 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Feb 13 16:01:56.754295 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Feb 13 16:01:56.754301 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Feb 13 16:01:56.754306 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Feb 13 16:01:56.754313 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Feb 13 16:01:56.754318 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Feb 13 16:01:56.754324 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Feb 13 16:01:56.754329 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Feb 13 16:01:56.754342 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Feb 13 16:01:56.754349 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Feb 13 16:01:56.754355 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Feb 13 16:01:56.754360 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Feb 13 16:01:56.754366 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Feb 13 16:01:56.754373 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Feb 13 16:01:56.754378 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Feb 13 16:01:56.754384 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Feb 13 16:01:56.754390 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Feb 13 16:01:56.754396 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Feb 13 16:01:56.754401 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Feb 13 16:01:56.754408 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 
flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65 Feb 13 16:01:56.754414 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Feb 13 16:01:56.754421 kernel: random: crng init done Feb 13 16:01:56.754427 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Feb 13 16:01:56.754433 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Feb 13 16:01:56.754438 kernel: printk: log_buf_len min size: 262144 bytes Feb 13 16:01:56.754450 kernel: printk: log_buf_len: 1048576 bytes Feb 13 16:01:56.754456 kernel: printk: early log buf free: 239648(91%) Feb 13 16:01:56.754461 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 16:01:56.754467 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Feb 13 16:01:56.754473 kernel: Fallback order for Node 0: 0 Feb 13 16:01:56.754480 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Feb 13 16:01:56.754486 kernel: Policy zone: DMA32 Feb 13 16:01:56.754492 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Feb 13 16:01:56.754498 kernel: Memory: 1934316K/2096628K available (14336K kernel code, 2301K rwdata, 22852K rodata, 43476K init, 1596K bss, 162052K reserved, 0K cma-reserved) Feb 13 16:01:56.754505 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Feb 13 16:01:56.754511 kernel: ftrace: allocating 37893 entries in 149 pages Feb 13 16:01:56.754517 kernel: ftrace: allocated 149 pages with 4 groups Feb 13 16:01:56.754523 kernel: Dynamic Preempt: voluntary Feb 13 16:01:56.754529 kernel: rcu: Preemptible hierarchical RCU implementation. Feb 13 16:01:56.754535 kernel: rcu: RCU event tracing is enabled. Feb 13 16:01:56.754541 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Feb 13 16:01:56.754547 kernel: Trampoline variant of Tasks RCU enabled. Feb 13 16:01:56.754553 kernel: Rude variant of Tasks RCU enabled. Feb 13 16:01:56.754559 kernel: Tracing variant of Tasks RCU enabled. Feb 13 16:01:56.754566 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Feb 13 16:01:56.755466 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Feb 13 16:01:56.755474 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Feb 13 16:01:56.755480 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Feb 13 16:01:56.755486 kernel: Console: colour VGA+ 80x25 Feb 13 16:01:56.755492 kernel: printk: console [tty0] enabled Feb 13 16:01:56.755498 kernel: printk: console [ttyS0] enabled Feb 13 16:01:56.755504 kernel: ACPI: Core revision 20230628 Feb 13 16:01:56.755510 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Feb 13 16:01:56.755516 kernel: APIC: Switch to symmetric I/O mode setup Feb 13 16:01:56.755524 kernel: x2apic enabled Feb 13 16:01:56.755530 kernel: APIC: Switched APIC routing to: physical x2apic Feb 13 16:01:56.755535 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Feb 13 16:01:56.755541 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 16:01:56.755547 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Feb 13 16:01:56.755553 kernel: Disabled fast string operations Feb 13 16:01:56.755559 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Feb 13 16:01:56.755565 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Feb 13 16:01:56.755571 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Feb 13 16:01:56.755578 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Feb 13 16:01:56.755584 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Feb 13 16:01:56.755590 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Feb 13 16:01:56.755596 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Feb 13 16:01:56.755601 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Feb 13 16:01:56.755607 kernel: RETBleed: Mitigation: Enhanced IBRS Feb 13 16:01:56.755613 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Feb 13 16:01:56.755619 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Feb 13 16:01:56.755625 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Feb 13 16:01:56.755632 kernel: SRBDS: Unknown: Dependent on hypervisor status Feb 13 16:01:56.755639 kernel: GDS: Unknown: Dependent on hypervisor status Feb 13 16:01:56.755645 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Feb 13 16:01:56.755650 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Feb 13 16:01:56.755656 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Feb 13 16:01:56.755662 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Feb 13 16:01:56.755668 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Feb 13 16:01:56.755674 kernel: Freeing SMP alternatives memory: 32K Feb 13 16:01:56.755681 kernel: pid_max: default: 131072 minimum: 1024 Feb 13 16:01:56.755687 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Feb 13 16:01:56.755692 kernel: landlock: Up and running. Feb 13 16:01:56.755698 kernel: SELinux: Initializing. Feb 13 16:01:56.755704 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 16:01:56.755710 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 16:01:56.755716 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Feb 13 16:01:56.755722 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 16:01:56.755728 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 16:01:56.755734 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 16:01:56.755740 kernel: Performance Events: Skylake events, core PMU driver. 
Feb 13 16:01:56.755746 kernel: core: CPUID marked event: 'cpu cycles' unavailable Feb 13 16:01:56.755752 kernel: core: CPUID marked event: 'instructions' unavailable Feb 13 16:01:56.755758 kernel: core: CPUID marked event: 'bus cycles' unavailable Feb 13 16:01:56.755764 kernel: core: CPUID marked event: 'cache references' unavailable Feb 13 16:01:56.755769 kernel: core: CPUID marked event: 'cache misses' unavailable Feb 13 16:01:56.755775 kernel: core: CPUID marked event: 'branch instructions' unavailable Feb 13 16:01:56.755781 kernel: core: CPUID marked event: 'branch misses' unavailable Feb 13 16:01:56.755787 kernel: ... version: 1 Feb 13 16:01:56.755793 kernel: ... bit width: 48 Feb 13 16:01:56.755799 kernel: ... generic registers: 4 Feb 13 16:01:56.755805 kernel: ... value mask: 0000ffffffffffff Feb 13 16:01:56.755811 kernel: ... max period: 000000007fffffff Feb 13 16:01:56.755817 kernel: ... fixed-purpose events: 0 Feb 13 16:01:56.755822 kernel: ... event mask: 000000000000000f Feb 13 16:01:56.755828 kernel: signal: max sigframe size: 1776 Feb 13 16:01:56.755834 kernel: rcu: Hierarchical SRCU implementation. Feb 13 16:01:56.755841 kernel: rcu: Max phase no-delay instances is 400. Feb 13 16:01:56.755847 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 16:01:56.755853 kernel: smp: Bringing up secondary CPUs ... Feb 13 16:01:56.755859 kernel: smpboot: x86: Booting SMP configuration: Feb 13 16:01:56.755865 kernel: .... node #0, CPUs: #1 Feb 13 16:01:56.755870 kernel: Disabled fast string operations Feb 13 16:01:56.755876 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Feb 13 16:01:56.755882 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 16:01:56.755888 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 16:01:56.755894 kernel: smpboot: Max logical packages: 128 Feb 13 16:01:56.755900 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Feb 13 16:01:56.755906 kernel: devtmpfs: initialized Feb 13 16:01:56.755912 kernel: x86/mm: Memory block size: 128MB Feb 13 16:01:56.755918 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Feb 13 16:01:56.755924 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 16:01:56.755930 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Feb 13 16:01:56.755936 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 16:01:56.755941 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 16:01:56.755947 kernel: audit: initializing netlink subsys (disabled) Feb 13 16:01:56.755954 kernel: audit: type=2000 audit(1739462514.067:1): state=initialized audit_enabled=0 res=1 Feb 13 16:01:56.755960 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 16:01:56.755966 kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 16:01:56.755971 kernel: cpuidle: using governor menu Feb 13 16:01:56.755977 kernel: Simple Boot Flag at 0x36 set to 0x80 Feb 13 16:01:56.755983 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 16:01:56.755990 kernel: dca service started, version 1.12.1 Feb 13 16:01:56.755996 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Feb 13 16:01:56.756001 kernel: PCI: Using configuration type 1 for base access Feb 13 16:01:56.756008 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Feb 13 16:01:56.756014 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 16:01:56.756020 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 16:01:56.756026 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 16:01:56.756031 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 16:01:56.756037 kernel: ACPI: Added _OSI(Module Device) Feb 13 16:01:56.756043 kernel: ACPI: Added _OSI(Processor Device) Feb 13 16:01:56.756049 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 16:01:56.756055 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 16:01:56.756062 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 16:01:56.756067 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Feb 13 16:01:56.756073 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 16:01:56.756079 kernel: ACPI: Interpreter enabled Feb 13 16:01:56.756085 kernel: ACPI: PM: (supports S0 S1 S5) Feb 13 16:01:56.756091 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 16:01:56.756096 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 16:01:56.756102 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 16:01:56.756109 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Feb 13 16:01:56.756115 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Feb 13 16:01:56.756192 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 16:01:56.756246 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Feb 13 16:01:56.756294 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Feb 13 16:01:56.756303 kernel: PCI host bridge to bus 0000:00 Feb 13 16:01:56.756352 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 16:01:56.756400 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Feb 13 16:01:56.757222 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 16:01:56.757280 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 16:01:56.757342 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Feb 13 16:01:56.757386 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Feb 13 16:01:56.757484 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Feb 13 16:01:56.757549 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Feb 13 16:01:56.757608 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Feb 13 16:01:56.757663 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Feb 13 16:01:56.757714 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Feb 13 16:01:56.757766 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Feb 13 16:01:56.757816 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Feb 13 16:01:56.757865 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Feb 13 16:01:56.757918 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Feb 13 16:01:56.757972 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Feb 13 16:01:56.758023 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Feb 13 16:01:56.758073 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Feb 13 16:01:56.758128 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Feb 13 16:01:56.758179 
kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Feb 13 16:01:56.758232 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Feb 13 16:01:56.758286 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Feb 13 16:01:56.758336 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Feb 13 16:01:56.758386 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Feb 13 16:01:56.758434 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Feb 13 16:01:56.760608 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Feb 13 16:01:56.760664 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 16:01:56.760726 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Feb 13 16:01:56.760781 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.760833 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.760887 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.760938 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.760990 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761042 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761097 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761189 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761243 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761292 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761346 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761395 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761458 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761510 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761566 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761618 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761673 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761724 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761782 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761843 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.761900 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.761951 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762004 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762058 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762112 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762198 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762252 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762302 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762355 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762408 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762474 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762527 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762583 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 
0x060400 Feb 13 16:01:56.762634 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762687 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762739 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762795 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762845 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.762898 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.762949 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.763002 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.763053 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.763109 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.763179 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.763248 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.763299 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.763353 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.763403 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.765478 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.765535 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.765590 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.765641 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.765694 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.765746 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.765802 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.765853 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.765907 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.765959 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.766012 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.766064 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.766119 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.766173 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.766227 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Feb 13 16:01:56.766278 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.766332 kernel: pci_bus 0000:01: extended config space not accessible Feb 13 16:01:56.766384 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 16:01:56.766435 kernel: pci_bus 0000:02: extended config space not accessible Feb 13 16:01:56.766457 kernel: acpiphp: Slot [32] registered Feb 13 16:01:56.766464 kernel: acpiphp: Slot [33] registered Feb 13 16:01:56.766470 kernel: acpiphp: Slot [34] registered Feb 13 16:01:56.766476 kernel: acpiphp: Slot [35] registered Feb 13 16:01:56.766482 kernel: acpiphp: Slot [36] registered Feb 13 16:01:56.766487 kernel: acpiphp: Slot [37] registered Feb 13 16:01:56.766493 kernel: acpiphp: Slot [38] registered Feb 13 16:01:56.766498 kernel: acpiphp: Slot [39] registered Feb 13 16:01:56.766504 kernel: acpiphp: Slot [40] registered Feb 13 16:01:56.766512 kernel: acpiphp: Slot [41] registered Feb 13 16:01:56.766517 kernel: acpiphp: Slot [42] registered Feb 13 
16:01:56.766523 kernel: acpiphp: Slot [43] registered Feb 13 16:01:56.766529 kernel: acpiphp: Slot [44] registered Feb 13 16:01:56.766550 kernel: acpiphp: Slot [45] registered Feb 13 16:01:56.766555 kernel: acpiphp: Slot [46] registered Feb 13 16:01:56.766561 kernel: acpiphp: Slot [47] registered Feb 13 16:01:56.766567 kernel: acpiphp: Slot [48] registered Feb 13 16:01:56.766572 kernel: acpiphp: Slot [49] registered Feb 13 16:01:56.766578 kernel: acpiphp: Slot [50] registered Feb 13 16:01:56.766584 kernel: acpiphp: Slot [51] registered Feb 13 16:01:56.766590 kernel: acpiphp: Slot [52] registered Feb 13 16:01:56.766595 kernel: acpiphp: Slot [53] registered Feb 13 16:01:56.766601 kernel: acpiphp: Slot [54] registered Feb 13 16:01:56.766606 kernel: acpiphp: Slot [55] registered Feb 13 16:01:56.766612 kernel: acpiphp: Slot [56] registered Feb 13 16:01:56.766617 kernel: acpiphp: Slot [57] registered Feb 13 16:01:56.766623 kernel: acpiphp: Slot [58] registered Feb 13 16:01:56.766628 kernel: acpiphp: Slot [59] registered Feb 13 16:01:56.766635 kernel: acpiphp: Slot [60] registered Feb 13 16:01:56.766641 kernel: acpiphp: Slot [61] registered Feb 13 16:01:56.766646 kernel: acpiphp: Slot [62] registered Feb 13 16:01:56.766652 kernel: acpiphp: Slot [63] registered Feb 13 16:01:56.766706 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Feb 13 16:01:56.766757 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 16:01:56.767570 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 16:01:56.767620 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 16:01:56.767669 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Feb 13 16:01:56.767723 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Feb 13 16:01:56.767771 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Feb 13 16:01:56.767820 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Feb 13 16:01:56.767867 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Feb 13 16:01:56.767924 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Feb 13 16:01:56.767974 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Feb 13 16:01:56.768027 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Feb 13 16:01:56.768078 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 16:01:56.768146 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 16:01:56.768212 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 16:01:56.768262 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 16:01:56.768312 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 16:01:56.768393 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 16:01:56.769608 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 16:01:56.769673 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 16:01:56.769726 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 16:01:56.769778 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 16:01:56.769829 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 16:01:56.769879 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 16:01:56.769928 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 16:01:56.769978 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 16:01:56.770027 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 16:01:56.770079 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 16:01:56.770132 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 16:01:56.770183 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 16:01:56.770250 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 16:01:56.770316 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 16:01:56.770403 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 16:01:56.770459 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 16:01:56.770509 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 16:01:56.770559 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 16:01:56.770609 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 16:01:56.770658 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 16:01:56.770708 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 16:01:56.770760 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 16:01:56.770809 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 16:01:56.770865 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Feb 13 16:01:56.770917 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Feb 13 16:01:56.770968 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Feb 13 16:01:56.771019 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Feb 13 16:01:56.771069 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Feb 13 16:01:56.771125 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 16:01:56.771236 kernel: pci 0000:0b:00.0: supports D1 D2 Feb 13 16:01:56.771287 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 16:01:56.771337 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 16:01:56.771387 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 16:01:56.771436 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 16:01:56.774725 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 16:01:56.774829 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 16:01:56.774901 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 16:01:56.774954 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 16:01:56.775007 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 16:01:56.775061 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 16:01:56.775113 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 16:01:56.775165 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 16:01:56.775217 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 16:01:56.775270 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 16:01:56.775325 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 16:01:56.775376 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 16:01:56.775429 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 16:01:56.775500 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 16:01:56.775553 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 16:01:56.775605 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 16:01:56.775656 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 16:01:56.775707 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 16:01:56.775763 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 16:01:56.775816 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 16:01:56.775867 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 16:01:56.775920 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 16:01:56.775972 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 16:01:56.776022 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 16:01:56.776076 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 16:01:56.776128 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 16:01:56.776181 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 16:01:56.776233 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 16:01:56.776301 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 16:01:56.776352 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 16:01:56.776420 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 16:01:56.776575 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 16:01:56.776644 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 16:01:56.776696 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 16:01:56.776745 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 16:01:56.776793 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 16:01:56.776843 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 16:01:56.776892 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 16:01:56.776942 kernel: pci 
0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 16:01:56.776992 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 16:01:56.777041 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 16:01:56.777093 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 16:01:56.777147 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 16:01:56.777197 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 16:01:56.777246 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 16:01:56.777297 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 16:01:56.777346 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 16:01:56.777395 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 16:01:56.777451 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 16:01:56.777505 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 16:01:56.777555 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 16:01:56.777604 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 16:01:56.777675 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 16:01:56.777735 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 16:01:56.777787 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 16:01:56.777837 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 16:01:56.777886 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 16:01:56.777938 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 16:01:56.777987 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 16:01:56.778037 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 16:01:56.778086 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 16:01:56.778134 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 16:01:56.778184 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 16:01:56.778233 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 16:01:56.778284 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 16:01:56.778334 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 16:01:56.778383 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 16:01:56.778432 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 16:01:56.778526 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 16:01:56.778575 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 16:01:56.778624 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 16:01:56.778673 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 16:01:56.778725 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 16:01:56.778775 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 16:01:56.778824 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 16:01:56.778872 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 16:01:56.778921 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 16:01:56.778929 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Feb 13 16:01:56.778935 kernel: ACPI: PCI: Interrupt link 
LNKB configured for IRQ 0 Feb 13 16:01:56.778941 kernel: ACPI: PCI: Interrupt link LNKB disabled Feb 13 16:01:56.778947 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 16:01:56.778955 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Feb 13 16:01:56.778961 kernel: iommu: Default domain type: Translated Feb 13 16:01:56.778966 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 16:01:56.778972 kernel: PCI: Using ACPI for IRQ routing Feb 13 16:01:56.778977 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 16:01:56.778983 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Feb 13 16:01:56.778989 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Feb 13 16:01:56.779036 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Feb 13 16:01:56.779084 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible Feb 13 16:01:56.779136 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 16:01:56.779144 kernel: vgaarb: loaded Feb 13 16:01:56.779150 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Feb 13 16:01:56.779156 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Feb 13 16:01:56.779162 kernel: clocksource: Switched to clocksource tsc-early Feb 13 16:01:56.779167 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 16:01:56.779173 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 16:01:56.779179 kernel: pnp: PnP ACPI init Feb 13 16:01:56.779229 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Feb 13 16:01:56.779278 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Feb 13 16:01:56.779323 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Feb 13 16:01:56.779371 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Feb 13 16:01:56.779418 kernel: pnp 00:06: [dma 2] Feb 13 16:01:56.779484 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Feb 13 16:01:56.779532 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Feb 13 16:01:56.779580 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Feb 13 16:01:56.779588 kernel: pnp: PnP ACPI: found 8 devices Feb 13 16:01:56.779594 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 16:01:56.779600 kernel: NET: Registered PF_INET protocol family Feb 13 16:01:56.779605 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 16:01:56.779612 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 16:01:56.779617 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 16:01:56.779623 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 16:01:56.779631 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 16:01:56.779637 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 16:01:56.779642 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 16:01:56.779648 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 16:01:56.779654 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 16:01:56.779659 kernel: NET: Registered PF_XDP protocol family Feb 13 16:01:56.779709 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Feb 13 16:01:56.779761 kernel: pci 
0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 16:01:56.779814 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 16:01:56.779864 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 16:01:56.779914 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 16:01:56.780208 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Feb 13 16:01:56.780261 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Feb 13 16:01:56.780312 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Feb 13 16:01:56.780366 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Feb 13 16:01:56.780417 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Feb 13 16:01:56.780480 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Feb 13 16:01:56.780533 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Feb 13 16:01:56.780583 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Feb 13 16:01:56.780633 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Feb 13 16:01:56.780686 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Feb 13 16:01:56.780735 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Feb 13 16:01:56.780784 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Feb 13 16:01:56.780834 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Feb 13 16:01:56.780884 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Feb 13 16:01:56.780933 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Feb 13 16:01:56.781004 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Feb 13 16:01:56.781056 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Feb 13 16:01:56.781105 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Feb 13 16:01:56.781202 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 16:01:56.781254 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 16:01:56.781304 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781357 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781406 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781463 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781514 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781563 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781612 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781661 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781710 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781762 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781811 kernel: pci 0000:00:16.3: BAR 13: no space for [io 
size 0x1000] Feb 13 16:01:56.781860 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.781908 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.781957 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782006 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782054 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782104 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782155 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782205 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782254 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782321 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782384 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782433 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782561 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782611 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782663 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782712 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782761 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782810 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782858 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.782907 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.782956 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783004 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783055 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783103 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783189 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783251 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783301 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783351 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783401 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783456 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783509 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783557 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783606 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783656 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783705 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783755 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783804 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783853 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.783901 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.783950 kernel: pci 0000:00:18.3: BAR 13: no space 
for [io size 0x1000] Feb 13 16:01:56.784001 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784050 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784099 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784152 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784202 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784251 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784300 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784348 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784397 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784480 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784533 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784583 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784632 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784682 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784731 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784781 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784830 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784879 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.784927 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.784979 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785029 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785077 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785126 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785174 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785223 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785272 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785321 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785369 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785421 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785482 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785532 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785582 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 16:01:56.785631 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 16:01:56.785681 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 16:01:56.785731 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Feb 13 16:01:56.785780 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 16:01:56.785828 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 16:01:56.785879 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 16:01:56.785932 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Feb 13 16:01:56.785982 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 
13 16:01:56.786031 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 16:01:56.786080 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 16:01:56.786129 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 16:01:56.786178 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 16:01:56.786228 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 16:01:56.786280 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 16:01:56.786328 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 16:01:56.786378 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 16:01:56.786427 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 16:01:56.786521 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 16:01:56.786571 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 16:01:56.786619 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 16:01:56.786686 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 16:01:56.786768 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 16:01:56.786853 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 16:01:56.786920 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 16:01:56.786968 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 16:01:56.787019 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 16:01:56.787067 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 16:01:56.787115 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 16:01:56.787169 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 16:01:56.787220 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 16:01:56.787269 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 16:01:56.787318 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 16:01:56.787367 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 16:01:56.787415 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 16:01:56.787474 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Feb 13 16:01:56.787524 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 16:01:56.787573 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 16:01:56.787622 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 16:01:56.787673 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 16:01:56.787723 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 16:01:56.787773 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 16:01:56.787822 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 16:01:56.787872 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 16:01:56.787921 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 16:01:56.787970 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 16:01:56.788020 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 16:01:56.788069 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 16:01:56.788121 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 16:01:56.788169 kernel: pci 0000:00:16.3: 
bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 16:01:56.788217 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 16:01:56.788266 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 16:01:56.788315 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 16:01:56.788363 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 16:01:56.788412 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 16:01:56.788468 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 16:01:56.788516 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 16:01:56.788565 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 16:01:56.788617 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 16:01:56.788666 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 16:01:56.788714 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 16:01:56.788763 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 16:01:56.788811 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 16:01:56.788861 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 16:01:56.788910 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 16:01:56.788959 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 16:01:56.789008 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 16:01:56.789060 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 16:01:56.789109 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 16:01:56.789158 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 16:01:56.789207 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 16:01:56.789256 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 16:01:56.789304 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 16:01:56.789353 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 16:01:56.789401 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 16:01:56.789457 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 16:01:56.789507 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 16:01:56.789559 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 16:01:56.789609 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 16:01:56.789658 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 16:01:56.789708 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 16:01:56.789757 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 16:01:56.789806 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 16:01:56.789855 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 16:01:56.789904 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 16:01:56.789954 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 16:01:56.790007 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 16:01:56.790056 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 16:01:56.790106 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 16:01:56.790158 kernel: pci 0000:00:17.7: bridge window [mem 
0xe5e00000-0xe5efffff 64bit pref] Feb 13 16:01:56.790207 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 16:01:56.790256 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 16:01:56.790305 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 16:01:56.790354 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 16:01:56.790403 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 16:01:56.790496 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 16:01:56.790551 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 16:01:56.790599 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 16:01:56.790648 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 16:01:56.790696 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 16:01:56.790743 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 16:01:56.790792 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 16:01:56.790840 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 16:01:56.790889 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 16:01:56.790938 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 16:01:56.790989 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 16:01:56.791038 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 16:01:56.791086 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 16:01:56.791164 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 16:01:56.791245 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 16:01:56.791294 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 16:01:56.791343 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 16:01:56.791392 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 16:01:56.791490 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 16:01:56.791542 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 16:01:56.791595 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 16:01:56.791641 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 16:01:56.791684 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 16:01:56.791727 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 16:01:56.791769 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Feb 13 16:01:56.791811 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Feb 13 16:01:56.791859 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Feb 13 16:01:56.791907 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Feb 13 16:01:56.791951 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 16:01:56.791995 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 16:01:56.792040 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 16:01:56.792084 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 16:01:56.792128 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Feb 13 16:01:56.792172 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Feb 13 16:01:56.792221 kernel: pci_bus 0000:03: resource 0 [io 
0x4000-0x4fff] Feb 13 16:01:56.792269 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Feb 13 16:01:56.792314 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 16:01:56.792363 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Feb 13 16:01:56.792409 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Feb 13 16:01:56.793510 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 16:01:56.793569 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Feb 13 16:01:56.793617 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Feb 13 16:01:56.793665 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 16:01:56.793713 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Feb 13 16:01:56.793758 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 16:01:56.793806 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Feb 13 16:01:56.793850 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 16:01:56.793900 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Feb 13 16:01:56.793948 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 16:01:56.793995 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Feb 13 16:01:56.794040 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 16:01:56.794095 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Feb 13 16:01:56.794152 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 16:01:56.794203 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Feb 13 16:01:56.794252 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Feb 13 16:01:56.794297 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 16:01:56.794346 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff] Feb 13 16:01:56.794391 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Feb 13 16:01:56.794437 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 16:01:56.794493 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Feb 13 16:01:56.794541 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Feb 13 16:01:56.794586 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 16:01:56.794634 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Feb 13 16:01:56.794680 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 16:01:56.794729 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Feb 13 16:01:56.794774 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 16:01:56.794824 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Feb 13 16:01:56.794872 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 16:01:56.794920 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Feb 13 16:01:56.794965 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 16:01:56.795016 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Feb 13 16:01:56.795062 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 16:01:56.795110 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Feb 13 16:01:56.795192 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Feb 13 16:01:56.795237 
kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 16:01:56.795286 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Feb 13 16:01:56.795330 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Feb 13 16:01:56.795375 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 16:01:56.795424 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff] Feb 13 16:01:56.796665 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Feb 13 16:01:56.796720 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 16:01:56.796810 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Feb 13 16:01:56.796885 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 16:01:56.796934 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Feb 13 16:01:56.796980 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 16:01:56.797029 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Feb 13 16:01:56.797076 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 16:01:56.797128 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Feb 13 16:01:56.797210 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 16:01:56.797260 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Feb 13 16:01:56.797306 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 16:01:56.797357 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Feb 13 16:01:56.797404 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Feb 13 16:01:56.797524 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 16:01:56.797575 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Feb 13 16:01:56.797620 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Feb 13 16:01:56.797676 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 16:01:56.797726 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Feb 13 16:01:56.797774 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 16:01:56.797823 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Feb 13 16:01:56.797870 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 16:01:56.797919 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Feb 13 16:01:56.797965 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 16:01:56.798015 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Feb 13 16:01:56.798064 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 16:01:56.798114 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Feb 13 16:01:56.798212 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 16:01:56.798262 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Feb 13 16:01:56.798308 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 16:01:56.798363 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 16:01:56.798373 kernel: PCI: CLS 32 bytes, default 64 Feb 13 16:01:56.798381 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 16:01:56.798388 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 
16:01:56.798394 kernel: clocksource: Switched to clocksource tsc Feb 13 16:01:56.798400 kernel: Initialise system trusted keyrings Feb 13 16:01:56.798406 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 16:01:56.798413 kernel: Key type asymmetric registered Feb 13 16:01:56.798418 kernel: Asymmetric key parser 'x509' registered Feb 13 16:01:56.798424 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 16:01:56.798430 kernel: io scheduler mq-deadline registered Feb 13 16:01:56.798438 kernel: io scheduler kyber registered Feb 13 16:01:56.798452 kernel: io scheduler bfq registered Feb 13 16:01:56.798510 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Feb 13 16:01:56.798563 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.798615 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Feb 13 16:01:56.798665 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.798716 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Feb 13 16:01:56.798770 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.798823 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Feb 13 16:01:56.798873 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.798924 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Feb 13 16:01:56.798975 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799026 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Feb 13 16:01:56.799080 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799131 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Feb 13 16:01:56.799181 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799232 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Feb 13 16:01:56.799282 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799332 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Feb 13 16:01:56.799385 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799435 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Feb 13 16:01:56.799500 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799551 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Feb 13 16:01:56.799601 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799651 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Feb 13 16:01:56.799701 kernel: pcieport 
0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799754 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Feb 13 16:01:56.799805 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799854 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Feb 13 16:01:56.799904 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.799953 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Feb 13 16:01:56.800006 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.800056 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Feb 13 16:01:56.800106 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.800160 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Feb 13 16:01:56.800211 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.800261 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Feb 13 16:01:56.800313 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.800363 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Feb 13 16:01:56.800413 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802392 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Feb 13 16:01:56.802464 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802523 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Feb 13 16:01:56.802580 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802633 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Feb 13 16:01:56.802684 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802735 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Feb 13 16:01:56.802787 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802842 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Feb 13 16:01:56.802894 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.802944 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Feb 13 16:01:56.802995 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803046 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Feb 13 16:01:56.803098 kernel: pcieport 
0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803184 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Feb 13 16:01:56.803238 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803290 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Feb 13 16:01:56.803341 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803391 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Feb 13 16:01:56.803447 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803527 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Feb 13 16:01:56.803595 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803648 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Feb 13 16:01:56.803698 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803749 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Feb 13 16:01:56.803804 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 16:01:56.803813 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Feb 13 16:01:56.803820 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 16:01:56.803826 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 16:01:56.803832 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 16:01:56.803839 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 16:01:56.803845 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 16:01:56.803894 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 16:01:56.803945 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T16:01:56 UTC (1739462516) Feb 13 16:01:56.803954 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 16:01:56.803998 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 16:01:56.804006 kernel: intel_pstate: CPU model not supported Feb 13 16:01:56.804013 kernel: NET: Registered PF_INET6 protocol family Feb 13 16:01:56.804019 kernel: Segment Routing with IPv6 Feb 13 16:01:56.804025 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 16:01:56.804032 kernel: NET: Registered PF_PACKET protocol family Feb 13 16:01:56.804038 kernel: Key type dns_resolver registered Feb 13 16:01:56.804046 kernel: IPI shorthand broadcast: enabled Feb 13 16:01:56.804052 kernel: sched_clock: Marking stable (904278475, 228669760)->(1192532627, -59584392) Feb 13 16:01:56.804058 kernel: registered taskstats version 1 Feb 13 16:01:56.804064 kernel: Loading compiled-in X.509 certificates Feb 13 16:01:56.804070 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: a260c8876205efb4ca2ab3eb040cd310ec7afd21' Feb 13 16:01:56.804077 kernel: Key type .fscrypt registered Feb 13 16:01:56.804083 kernel: Key type fscrypt-provisioning registered Feb 13 16:01:56.804089 
kernel: ima: No TPM chip found, activating TPM-bypass! Feb 13 16:01:56.804096 kernel: ima: Allocated hash algorithm: sha1 Feb 13 16:01:56.804102 kernel: ima: No architecture policies found Feb 13 16:01:56.804108 kernel: clk: Disabling unused clocks Feb 13 16:01:56.804115 kernel: Freeing unused kernel image (initmem) memory: 43476K Feb 13 16:01:56.804124 kernel: Write protecting the kernel read-only data: 38912k Feb 13 16:01:56.804149 kernel: Freeing unused kernel image (rodata/data gap) memory: 1724K Feb 13 16:01:56.804155 kernel: Run /init as init process Feb 13 16:01:56.804161 kernel: with arguments: Feb 13 16:01:56.804167 kernel: /init Feb 13 16:01:56.804190 kernel: with environment: Feb 13 16:01:56.804196 kernel: HOME=/ Feb 13 16:01:56.804202 kernel: TERM=linux Feb 13 16:01:56.804208 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 16:01:56.804240 systemd[1]: Successfully made /usr/ read-only. Feb 13 16:01:56.804250 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 16:01:56.804257 systemd[1]: Detected virtualization vmware. Feb 13 16:01:56.804263 systemd[1]: Detected architecture x86-64. Feb 13 16:01:56.804271 systemd[1]: Running in initrd. Feb 13 16:01:56.804278 systemd[1]: No hostname configured, using default hostname. Feb 13 16:01:56.804284 systemd[1]: Hostname set to . Feb 13 16:01:56.804290 systemd[1]: Initializing machine ID from random generator. Feb 13 16:01:56.804297 systemd[1]: Queued start job for default target initrd.target. Feb 13 16:01:56.804303 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 16:01:56.804309 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 16:01:56.804316 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 16:01:56.804324 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 16:01:56.804331 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 16:01:56.804338 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 16:01:56.804345 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 16:01:56.804352 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 16:01:56.804358 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 16:01:56.804365 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 16:01:56.804372 systemd[1]: Reached target paths.target - Path Units. Feb 13 16:01:56.804379 systemd[1]: Reached target slices.target - Slice Units. Feb 13 16:01:56.804385 systemd[1]: Reached target swap.target - Swaps. Feb 13 16:01:56.804391 systemd[1]: Reached target timers.target - Timer Units. Feb 13 16:01:56.804398 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 16:01:56.804404 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. 
Feb 13 16:01:56.804410 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 16:01:56.804417 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Feb 13 16:01:56.804423 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 16:01:56.804431 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 16:01:56.804437 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 16:01:56.804474 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 16:01:56.804482 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 16:01:56.804489 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 16:01:56.804495 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 16:01:56.804501 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 16:01:56.804508 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 16:01:56.804516 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 16:01:56.804523 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 16:01:56.804529 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 16:01:56.804551 systemd-journald[215]: Collecting audit messages is disabled. Feb 13 16:01:56.804569 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 16:01:56.804576 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 16:01:56.804583 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 16:01:56.804589 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 16:01:56.804597 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 16:01:56.804603 kernel: Bridge firewalling registered Feb 13 16:01:56.804610 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 16:01:56.804616 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 16:01:56.804622 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 16:01:56.804629 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 16:01:56.804636 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 16:01:56.804643 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 16:01:56.804650 systemd-journald[215]: Journal started Feb 13 16:01:56.804668 systemd-journald[215]: Runtime Journal (/run/log/journal/91dbf0b496994aef903d36a7a27048a2) is 4.8M, max 38.6M, 33.8M free. Feb 13 16:01:56.753448 systemd-modules-load[216]: Inserted module 'overlay' Feb 13 16:01:56.777741 systemd-modules-load[216]: Inserted module 'br_netfilter' Feb 13 16:01:56.809672 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Feb 13 16:01:56.809689 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 16:01:56.809904 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 16:01:56.811396 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Feb 13 16:01:56.812526 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 16:01:56.818341 dracut-cmdline[239]: dracut-dracut-053 Feb 13 16:01:56.818577 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 16:01:56.819602 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 16:01:56.821710 dracut-cmdline[239]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65 Feb 13 16:01:56.847089 systemd-resolved[261]: Positive Trust Anchors: Feb 13 16:01:56.847097 systemd-resolved[261]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 16:01:56.847118 systemd-resolved[261]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 16:01:56.849687 systemd-resolved[261]: Defaulting to hostname 'linux'. Feb 13 16:01:56.850274 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 16:01:56.850504 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 16:01:56.869457 kernel: SCSI subsystem initialized Feb 13 16:01:56.875454 kernel: Loading iSCSI transport class v2.0-870. Feb 13 16:01:56.881457 kernel: iscsi: registered transport (tcp) Feb 13 16:01:56.894454 kernel: iscsi: registered transport (qla4xxx) Feb 13 16:01:56.894471 kernel: QLogic iSCSI HBA Driver Feb 13 16:01:56.912651 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Feb 13 16:01:56.917514 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Feb 13 16:01:56.932155 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Feb 13 16:01:56.932176 kernel: device-mapper: uevent: version 1.0.3 Feb 13 16:01:56.932185 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Feb 13 16:01:56.962483 kernel: raid6: avx2x4 gen() 48046 MB/s Feb 13 16:01:56.979483 kernel: raid6: avx2x2 gen() 55048 MB/s Feb 13 16:01:56.996599 kernel: raid6: avx2x1 gen() 46628 MB/s Feb 13 16:01:56.996616 kernel: raid6: using algorithm avx2x2 gen() 55048 MB/s Feb 13 16:01:57.015469 kernel: raid6: .... xor() 33426 MB/s, rmw enabled Feb 13 16:01:57.015490 kernel: raid6: using avx2x2 recovery algorithm Feb 13 16:01:57.027451 kernel: xor: automatically using best checksumming function avx Feb 13 16:01:57.114460 kernel: Btrfs loaded, zoned=no, fsverity=no Feb 13 16:01:57.119824 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Feb 13 16:01:57.125514 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... 
Feb 13 16:01:57.133768 systemd-udevd[435]: Using default interface naming scheme 'v255'. Feb 13 16:01:57.136575 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 16:01:57.143531 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Feb 13 16:01:57.149754 dracut-pre-trigger[443]: rd.md=0: removing MD RAID activation Feb 13 16:01:57.163961 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 16:01:57.167585 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 16:01:57.236643 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 16:01:57.240557 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Feb 13 16:01:57.249483 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Feb 13 16:01:57.250170 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 16:01:57.250896 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 16:01:57.251248 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 16:01:57.255567 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Feb 13 16:01:57.260871 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Feb 13 16:01:57.309632 kernel: VMware PVSCSI driver - version 1.0.7.0-k Feb 13 16:01:57.309665 kernel: vmw_pvscsi: using 64bit dma Feb 13 16:01:57.309674 kernel: vmw_pvscsi: max_id: 16 Feb 13 16:01:57.311222 kernel: vmw_pvscsi: setting ring_pages to 8 Feb 13 16:01:57.314173 kernel: vmw_pvscsi: enabling reqCallThreshold Feb 13 16:01:57.314190 kernel: vmw_pvscsi: driver-based request coalescing enabled Feb 13 16:01:57.314202 kernel: vmw_pvscsi: using MSI-X Feb 13 16:01:57.314210 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Feb 13 16:01:57.321047 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Feb 13 16:01:57.321064 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Feb 13 16:01:57.321595 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Feb 13 16:01:57.329793 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Feb 13 16:01:57.329892 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Feb 13 16:01:57.329982 kernel: libata version 3.00 loaded. Feb 13 16:01:57.332456 kernel: ata_piix 0000:00:07.1: version 2.13 Feb 13 16:01:57.339635 kernel: scsi host1: ata_piix Feb 13 16:01:57.339938 kernel: scsi host2: ata_piix Feb 13 16:01:57.340925 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Feb 13 16:01:57.340944 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Feb 13 16:01:57.340958 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 16:01:57.349203 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 16:01:57.349283 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 16:01:57.349634 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Feb 13 16:01:57.349637 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 16:01:57.349732 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 16:01:57.349805 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
Feb 13 16:01:57.350051 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 16:01:57.355648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 16:01:57.366220 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 16:01:57.366857 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 16:01:57.378625 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 16:01:57.507468 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Feb 13 16:01:57.516224 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Feb 13 16:01:57.529624 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Feb 13 16:01:57.581655 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 16:01:57.581752 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Feb 13 16:01:57.581817 kernel: sd 0:0:0:0: [sda] Cache data unavailable Feb 13 16:01:57.581879 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Feb 13 16:01:57.581939 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 16:01:57.581949 kernel: AES CTR mode by8 optimization enabled Feb 13 16:01:57.581957 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 16:01:57.581964 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 16:01:57.598460 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Feb 13 16:01:57.613598 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Feb 13 16:01:57.613614 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Feb 13 16:01:57.728658 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (481) Feb 13 16:01:57.730450 kernel: BTRFS: device fsid 506754f7-5ef1-4c63-ad2a-b7b855a48f85 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (491) Feb 13 16:01:57.738143 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Feb 13 16:01:57.743918 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Feb 13 16:01:57.749433 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 16:01:57.753898 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Feb 13 16:01:57.754111 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Feb 13 16:01:57.759511 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Feb 13 16:01:57.784473 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 16:01:57.791462 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 16:01:58.791383 disk-uuid[590]: The operation has completed successfully. Feb 13 16:01:58.792284 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 16:01:58.829893 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 16:01:58.830185 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 16:01:58.852641 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 16:01:58.854705 sh[606]: Success Feb 13 16:01:58.864481 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 16:01:58.898694 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Feb 13 16:01:58.901005 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... 
Feb 13 16:01:58.901230 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 16:01:58.914640 kernel: BTRFS info (device dm-0): first mount of filesystem 506754f7-5ef1-4c63-ad2a-b7b855a48f85 Feb 13 16:01:58.914660 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 16:01:58.914672 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 16:01:58.915739 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 16:01:58.917453 kernel: BTRFS info (device dm-0): using free space tree Feb 13 16:01:58.923455 kernel: BTRFS info (device dm-0): enabling ssd optimizations Feb 13 16:01:58.924800 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 16:01:58.932627 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Feb 13 16:01:58.934547 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Feb 13 16:01:58.958979 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 16:01:58.959013 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 16:01:58.959028 kernel: BTRFS info (device sda6): using free space tree Feb 13 16:01:58.977459 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 16:01:58.982980 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 16:01:58.984467 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 16:01:58.987457 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 16:01:58.992539 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 16:01:59.004241 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 16:01:59.009523 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Feb 13 16:01:59.064161 ignition[667]: Ignition 2.20.0 Feb 13 16:01:59.064168 ignition[667]: Stage: fetch-offline Feb 13 16:01:59.064188 ignition[667]: no configs at "/usr/lib/ignition/base.d" Feb 13 16:01:59.064192 ignition[667]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:01:59.064245 ignition[667]: parsed url from cmdline: "" Feb 13 16:01:59.064247 ignition[667]: no config URL provided Feb 13 16:01:59.064249 ignition[667]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 16:01:59.064254 ignition[667]: no config at "/usr/lib/ignition/user.ign" Feb 13 16:01:59.064617 ignition[667]: config successfully fetched Feb 13 16:01:59.064634 ignition[667]: parsing config with SHA512: e29368e2b1891d2c2fdee862935aea13eb9683aca898d5baa374e53f71a3b4bb1ac7b0adc07fc4a6bd8fe640be1d483efcade4387060cf90a995ce8a36e810ce Feb 13 16:01:59.067287 unknown[667]: fetched base config from "system" Feb 13 16:01:59.067293 unknown[667]: fetched user config from "vmware" Feb 13 16:01:59.067891 ignition[667]: fetch-offline: fetch-offline passed Feb 13 16:01:59.067939 ignition[667]: Ignition finished successfully Feb 13 16:01:59.068627 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 16:01:59.086521 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 16:01:59.090529 systemd[1]: Starting systemd-networkd.service - Network Configuration... 
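Aside (illustrative only): in the fetch-offline stage above, Ignition reports the config it obtained from the VMware platform and logs its SHA512 digest before parsing. Assuming the raw config bytes were saved to a local file (the file name below is a placeholder), that digest can be reproduced with a few lines of Python and compared against the value in the log.

    # Illustrative sketch: compute the SHA512 digest Ignition logs for a
    # fetched config ("parsing config with SHA512: ..."). Ignition hashes the
    # raw config bytes it received; "user.ign" here is just a placeholder.
    import hashlib

    def config_sha512(path: str) -> str:
        with open(path, "rb") as f:
            return hashlib.sha512(f.read()).hexdigest()

    # Example: compare against the digest printed in the log above.
    # print(config_sha512("user.ign"))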
Feb 13 16:01:59.103687 systemd-networkd[801]: lo: Link UP Feb 13 16:01:59.103692 systemd-networkd[801]: lo: Gained carrier Feb 13 16:01:59.104608 systemd-networkd[801]: Enumeration completed Feb 13 16:01:59.104751 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 16:01:59.104884 systemd[1]: Reached target network.target - Network. Feb 13 16:01:59.104907 systemd-networkd[801]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Feb 13 16:01:59.104973 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 16:01:59.107964 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 16:01:59.108125 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 16:01:59.108561 systemd-networkd[801]: ens192: Link UP Feb 13 16:01:59.108567 systemd-networkd[801]: ens192: Gained carrier Feb 13 16:01:59.109924 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 16:01:59.117485 ignition[804]: Ignition 2.20.0 Feb 13 16:01:59.117495 ignition[804]: Stage: kargs Feb 13 16:01:59.117592 ignition[804]: no configs at "/usr/lib/ignition/base.d" Feb 13 16:01:59.117598 ignition[804]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:01:59.118085 ignition[804]: kargs: kargs passed Feb 13 16:01:59.118110 ignition[804]: Ignition finished successfully Feb 13 16:01:59.119111 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Feb 13 16:01:59.120066 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Feb 13 16:01:59.129094 ignition[811]: Ignition 2.20.0 Feb 13 16:01:59.129102 ignition[811]: Stage: disks Feb 13 16:01:59.129191 ignition[811]: no configs at "/usr/lib/ignition/base.d" Feb 13 16:01:59.129197 ignition[811]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:01:59.129696 ignition[811]: disks: disks passed Feb 13 16:01:59.129720 ignition[811]: Ignition finished successfully Feb 13 16:01:59.130319 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 16:01:59.130653 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 16:01:59.130774 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Feb 13 16:01:59.131024 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 16:01:59.131240 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 16:01:59.131408 systemd[1]: Reached target basic.target - Basic System. Feb 13 16:01:59.138620 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 16:01:59.149100 systemd-fsck[820]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Feb 13 16:01:59.149865 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 16:01:59.917578 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 16:01:59.977460 kernel: EXT4-fs (sda9): mounted filesystem 8023eced-1511-4e72-a58a-db1b8cb3210e r/w with ordered data mode. Quota mode: none. Feb 13 16:01:59.978081 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 16:01:59.978566 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 16:01:59.982635 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 16:01:59.984500 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Feb 13 16:01:59.984824 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 16:01:59.984853 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 16:01:59.984868 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 16:01:59.987569 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 16:01:59.988310 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Feb 13 16:01:59.992477 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (828) Feb 13 16:01:59.995383 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 16:01:59.995400 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 16:01:59.995408 kernel: BTRFS info (device sda6): using free space tree Feb 13 16:02:00.000369 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 16:02:00.001197 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 16:02:00.019014 initrd-setup-root[852]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 16:02:00.021456 initrd-setup-root[859]: cut: /sysroot/etc/group: No such file or directory Feb 13 16:02:00.024192 initrd-setup-root[866]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 16:02:00.026746 initrd-setup-root[873]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 16:02:00.078940 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 16:02:00.083514 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 16:02:00.084664 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 16:02:00.090468 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 16:02:00.101707 ignition[940]: INFO : Ignition 2.20.0 Feb 13 16:02:00.101707 ignition[940]: INFO : Stage: mount Feb 13 16:02:00.102109 ignition[940]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 16:02:00.102109 ignition[940]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:02:00.102368 ignition[940]: INFO : mount: mount passed Feb 13 16:02:00.102368 ignition[940]: INFO : Ignition finished successfully Feb 13 16:02:00.103958 systemd[1]: Finished ignition-mount.service - Ignition (mount). Feb 13 16:02:00.108511 systemd[1]: Starting ignition-files.service - Ignition (files)... Feb 13 16:02:00.108783 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Feb 13 16:02:00.913321 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 16:02:00.918725 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 16:02:00.927831 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (952) Feb 13 16:02:00.929510 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 16:02:00.929533 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 16:02:00.931552 kernel: BTRFS info (device sda6): using free space tree Feb 13 16:02:00.934506 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 16:02:00.935793 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Feb 13 16:02:00.950534 ignition[968]: INFO : Ignition 2.20.0 Feb 13 16:02:00.950534 ignition[968]: INFO : Stage: files Feb 13 16:02:00.950930 ignition[968]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 16:02:00.950930 ignition[968]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:02:00.951218 ignition[968]: DEBUG : files: compiled without relabeling support, skipping Feb 13 16:02:00.952106 ignition[968]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 16:02:00.952106 ignition[968]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 16:02:00.953414 ignition[968]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 16:02:00.953646 ignition[968]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 16:02:00.953983 unknown[968]: wrote ssh authorized keys file for user: core Feb 13 16:02:00.954244 ignition[968]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 16:02:00.955999 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Feb 13 16:02:00.955999 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Feb 13 16:02:00.997508 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 16:02:01.108820 systemd-networkd[801]: ens192: Gained IPv6LL Feb 13 16:02:01.442125 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 16:02:01.442491 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 16:02:01.444085 ignition[968]: INFO : files: 
createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 16:02:01.444085 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 Feb 13 16:02:01.969593 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Feb 13 16:02:02.145859 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" Feb 13 16:02:02.145859 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 16:02:02.145859 ignition[968]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Feb 13 16:02:02.145859 ignition[968]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Feb 13 16:02:02.168488 ignition[968]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 16:02:02.170462 ignition[968]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 16:02:02.170462 ignition[968]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Feb 13 16:02:02.170462 ignition[968]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Feb 13 16:02:02.170462 ignition[968]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 16:02:02.172320 ignition[968]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 16:02:02.172320 ignition[968]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 16:02:02.172320 ignition[968]: INFO : files: files passed Feb 13 16:02:02.172320 ignition[968]: INFO : Ignition finished successfully Feb 13 
16:02:02.171319 systemd[1]: Finished ignition-files.service - Ignition (files). Feb 13 16:02:02.175607 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Feb 13 16:02:02.177105 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 16:02:02.177683 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 16:02:02.177860 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 16:02:02.183082 initrd-setup-root-after-ignition[1000]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 16:02:02.183082 initrd-setup-root-after-ignition[1000]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 16:02:02.183451 initrd-setup-root-after-ignition[1004]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 16:02:02.184338 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 16:02:02.184636 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 16:02:02.191562 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 16:02:02.203716 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 16:02:02.203774 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 16:02:02.204059 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 16:02:02.204171 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 16:02:02.204365 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 16:02:02.204839 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 16:02:02.214146 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 16:02:02.218540 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 16:02:02.223928 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 16:02:02.224077 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 16:02:02.224325 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 16:02:02.224530 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 16:02:02.224595 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 16:02:02.224843 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 16:02:02.225083 systemd[1]: Stopped target basic.target - Basic System. Feb 13 16:02:02.225259 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 16:02:02.225468 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 16:02:02.225660 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 16:02:02.226003 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 16:02:02.226210 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 16:02:02.226424 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 16:02:02.226618 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 16:02:02.226810 systemd[1]: Stopped target swap.target - Swaps. Feb 13 16:02:02.226968 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
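Aside (hedged sketch, not the actual config used here): the files stage logged above created the "core" user's SSH keys, fetched the Helm tarball and the kubernetes sysext image, linked it under /etc/extensions, and flipped presets for prepare-helm.service and coreos-metadata.service. The snippet below builds the kind of Ignition v3-style config that could drive such a run; field names follow the public Ignition v3 schema as commonly documented, the SSH key is a placeholder, and only the paths and URLs mirror what the log shows.

    # Hedged sketch of an Ignition v3-style config resembling the files stage
    # above. Schema field names are assumptions based on the public Ignition
    # v3 spec; the SSH key is a placeholder; paths/URLs echo the log.
    import json

    config = {
        "ignition": {"version": "3.4.0"},
        "passwd": {"users": [{
            "name": "core",
            "sshAuthorizedKeys": ["ssh-ed25519 AAAA... placeholder"],
        }]},
        "storage": {
            "files": [{
                "path": "/opt/helm-v3.17.0-linux-amd64.tar.gz",
                "contents": {"source": "https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz"},
            }],
            "links": [{
                "path": "/etc/extensions/kubernetes.raw",
                "target": "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw",
            }],
        },
        "systemd": {"units": [
            {"name": "prepare-helm.service", "enabled": True},
            {"name": "coreos-metadata.service", "enabled": False},
        ]},
    }

    print(json.dumps(config, indent=2))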
Feb 13 16:02:02.227030 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Feb 13 16:02:02.227356 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Feb 13 16:02:02.227518 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 16:02:02.227706 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Feb 13 16:02:02.227747 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 16:02:02.227916 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 16:02:02.227978 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 16:02:02.228229 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 16:02:02.228293 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 16:02:02.228549 systemd[1]: Stopped target paths.target - Path Units. Feb 13 16:02:02.228696 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 16:02:02.232464 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 16:02:02.232626 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 16:02:02.232843 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 16:02:02.233011 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 16:02:02.233065 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 16:02:02.233231 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 16:02:02.233274 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 16:02:02.233534 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 16:02:02.233598 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 16:02:02.233840 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 16:02:02.233899 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 16:02:02.242558 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 16:02:02.242892 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 16:02:02.243117 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 16:02:02.245577 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 16:02:02.245839 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 16:02:02.246051 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 16:02:02.246390 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 16:02:02.246639 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 16:02:02.249582 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 16:02:02.250138 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 16:02:02.252196 ignition[1024]: INFO : Ignition 2.20.0 Feb 13 16:02:02.252196 ignition[1024]: INFO : Stage: umount Feb 13 16:02:02.255056 ignition[1024]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 16:02:02.255056 ignition[1024]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 16:02:02.255056 ignition[1024]: INFO : umount: umount passed Feb 13 16:02:02.255056 ignition[1024]: INFO : Ignition finished successfully Feb 13 16:02:02.254791 systemd[1]: ignition-mount.service: Deactivated successfully. 
Feb 13 16:02:02.255691 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 16:02:02.256030 systemd[1]: Stopped target network.target - Network. Feb 13 16:02:02.256141 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 16:02:02.256179 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 16:02:02.256299 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 16:02:02.256324 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 16:02:02.256435 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 16:02:02.256470 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 16:02:02.256575 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Feb 13 16:02:02.256599 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 16:02:02.256784 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Feb 13 16:02:02.256927 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 16:02:02.262484 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 16:02:02.264730 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 16:02:02.264796 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 16:02:02.266474 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Feb 13 16:02:02.266605 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 16:02:02.266666 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 16:02:02.267820 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Feb 13 16:02:02.268110 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 16:02:02.268142 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 16:02:02.277529 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 16:02:02.277634 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 16:02:02.277667 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 16:02:02.277805 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Feb 13 16:02:02.277830 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 16:02:02.277951 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 16:02:02.277973 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 16:02:02.279128 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 16:02:02.279155 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 16:02:02.279269 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Feb 13 16:02:02.279292 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 16:02:02.279463 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 16:02:02.280107 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 16:02:02.280147 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Feb 13 16:02:02.285259 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 16:02:02.285324 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Feb 13 16:02:02.294801 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 16:02:02.294893 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 16:02:02.295180 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 16:02:02.295207 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 16:02:02.295414 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 16:02:02.295430 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 16:02:02.295752 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 16:02:02.295779 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Feb 13 16:02:02.296020 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 16:02:02.296045 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 16:02:02.296322 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 16:02:02.296348 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 16:02:02.300556 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 16:02:02.300678 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 16:02:02.300705 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 16:02:02.300893 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 16:02:02.300917 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 16:02:02.301630 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Feb 13 16:02:02.301664 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Feb 13 16:02:02.303587 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 16:02:02.303649 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 16:02:02.345606 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 16:02:02.345670 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 16:02:02.346073 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 16:02:02.346198 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 16:02:02.346229 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 16:02:02.350523 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 16:02:02.360371 systemd[1]: Switching root. Feb 13 16:02:02.395896 systemd-journald[215]: Journal stopped Feb 13 16:02:03.555784 systemd-journald[215]: Received SIGTERM from PID 1 (systemd). Feb 13 16:02:03.555811 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 16:02:03.555820 kernel: SELinux: policy capability open_perms=1 Feb 13 16:02:03.555826 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 16:02:03.555831 kernel: SELinux: policy capability always_check_network=0 Feb 13 16:02:03.555837 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 16:02:03.555844 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 16:02:03.555851 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 16:02:03.555856 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 16:02:03.555862 systemd[1]: Successfully loaded SELinux policy in 30.750ms. 
Feb 13 16:02:03.555869 kernel: audit: type=1403 audit(1739462522.952:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 16:02:03.555876 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.895ms. Feb 13 16:02:03.555883 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 16:02:03.555891 systemd[1]: Detected virtualization vmware. Feb 13 16:02:03.555898 systemd[1]: Detected architecture x86-64. Feb 13 16:02:03.555904 systemd[1]: Detected first boot. Feb 13 16:02:03.555911 systemd[1]: Initializing machine ID from random generator. Feb 13 16:02:03.555918 zram_generator::config[1069]: No configuration found. Feb 13 16:02:03.556004 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Feb 13 16:02:03.556015 kernel: Guest personality initialized and is active Feb 13 16:02:03.556021 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Feb 13 16:02:03.556027 kernel: Initialized host personality Feb 13 16:02:03.556033 kernel: NET: Registered PF_VSOCK protocol family Feb 13 16:02:03.556040 systemd[1]: Populated /etc with preset unit settings. Feb 13 16:02:03.556050 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:02:03.556057 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Feb 13 16:02:03.556065 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Feb 13 16:02:03.556071 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 16:02:03.556077 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Feb 13 16:02:03.556084 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 16:02:03.556092 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Feb 13 16:02:03.556099 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Feb 13 16:02:03.556105 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Feb 13 16:02:03.556112 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Feb 13 16:02:03.556119 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Feb 13 16:02:03.556126 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Feb 13 16:02:03.556132 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Feb 13 16:02:03.556139 systemd[1]: Created slice user.slice - User and Session Slice. Feb 13 16:02:03.556147 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 16:02:03.556154 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 16:02:03.556162 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Feb 13 16:02:03.556169 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
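Aside (illustrative only): the warning above about unknown escape sequences comes from a coreos-metadata.service drop-in whose command shells out to `ip addr show ens192` and greps addresses beginning with "inet 10." as the private IPv4 and everything else as the public one. As a hedged illustration of what that pipeline computes, and not the unit's actual implementation, the sketch below performs the same split in Python; the interface name and the 10.x heuristic are taken from the logged command, the rest is assumption.

    # Illustrative sketch of the address split done by the logged
    # coreos-metadata drop-in: "inet 10.*" addresses are treated as private,
    # any other IPv4 address as public. Interface name taken from the log.
    import re
    import subprocess

    def split_addresses(interface: str = "ens192"):
        out = subprocess.run(["ip", "addr", "show", interface],
                             capture_output=True, text=True, check=True).stdout
        private, public = [], []
        for line in out.splitlines():
            m = re.search(r"inet (\d+\.\d+\.\d+\.\d+)", line)
            if not m:
                continue
            if line.strip().startswith("inet 10."):
                private.append(m.group(1))
            else:
                public.append(m.group(1))
        return private, public

    # print(split_addresses())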
Feb 13 16:02:03.556176 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Feb 13 16:02:03.556183 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 16:02:03.556189 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Feb 13 16:02:03.556196 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 16:02:03.556204 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Feb 13 16:02:03.556211 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Feb 13 16:02:03.556218 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Feb 13 16:02:03.556225 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Feb 13 16:02:03.556232 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 16:02:03.556238 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 16:02:03.556245 systemd[1]: Reached target slices.target - Slice Units. Feb 13 16:02:03.556252 systemd[1]: Reached target swap.target - Swaps. Feb 13 16:02:03.556260 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Feb 13 16:02:03.556267 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Feb 13 16:02:03.556274 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Feb 13 16:02:03.556281 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 16:02:03.556288 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 16:02:03.556296 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 16:02:03.556303 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Feb 13 16:02:03.556310 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Feb 13 16:02:03.556317 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Feb 13 16:02:03.556324 systemd[1]: Mounting media.mount - External Media Directory... Feb 13 16:02:03.556331 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:03.556338 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Feb 13 16:02:03.556344 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Feb 13 16:02:03.556352 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Feb 13 16:02:03.556361 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 16:02:03.559588 systemd[1]: Reached target machines.target - Containers. Feb 13 16:02:03.559606 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Feb 13 16:02:03.559615 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Feb 13 16:02:03.559622 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 16:02:03.559629 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Feb 13 16:02:03.559636 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 16:02:03.559646 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Feb 13 16:02:03.559653 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 16:02:03.559660 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Feb 13 16:02:03.559668 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 16:02:03.559675 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 16:02:03.559683 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 16:02:03.559690 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Feb 13 16:02:03.559697 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 16:02:03.559704 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 16:02:03.559714 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 16:02:03.559721 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 16:02:03.559728 kernel: fuse: init (API version 7.39) Feb 13 16:02:03.559735 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 16:02:03.559742 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Feb 13 16:02:03.559749 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Feb 13 16:02:03.559756 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Feb 13 16:02:03.559780 systemd-journald[1162]: Collecting audit messages is disabled. Feb 13 16:02:03.559799 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 16:02:03.559806 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 16:02:03.559813 kernel: loop: module loaded Feb 13 16:02:03.559820 systemd[1]: Stopped verity-setup.service. Feb 13 16:02:03.559829 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:03.559837 systemd-journald[1162]: Journal started Feb 13 16:02:03.559852 systemd-journald[1162]: Runtime Journal (/run/log/journal/ef1acec29c4f446baadf8a397a3cddad) is 4.8M, max 38.6M, 33.8M free. Feb 13 16:02:03.562470 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Feb 13 16:02:03.562492 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Feb 13 16:02:03.374855 systemd[1]: Queued start job for default target multi-user.target. Feb 13 16:02:03.382702 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Feb 13 16:02:03.382953 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 16:02:03.562971 jq[1139]: true Feb 13 16:02:03.563455 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 16:02:03.563666 systemd[1]: Mounted media.mount - External Media Directory. Feb 13 16:02:03.563823 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Feb 13 16:02:03.563973 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Feb 13 16:02:03.564106 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Feb 13 16:02:03.567167 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
Feb 13 16:02:03.567528 systemd[1]: modprobe@configfs.service: Deactivated successfully. Feb 13 16:02:03.567650 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Feb 13 16:02:03.567874 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 16:02:03.567972 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 16:02:03.568215 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 16:02:03.568325 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 16:02:03.568604 systemd[1]: modprobe@fuse.service: Deactivated successfully. Feb 13 16:02:03.568695 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Feb 13 16:02:03.568906 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 16:02:03.568992 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 16:02:03.569241 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 16:02:03.569651 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Feb 13 16:02:03.569896 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Feb 13 16:02:03.581604 systemd[1]: Reached target network-pre.target - Preparation for Network. Feb 13 16:02:03.586378 jq[1177]: true Feb 13 16:02:03.586635 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Feb 13 16:02:03.590512 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Feb 13 16:02:03.590647 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Feb 13 16:02:03.590672 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 16:02:03.592153 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Feb 13 16:02:03.594550 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Feb 13 16:02:03.604558 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Feb 13 16:02:03.604740 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 16:02:03.610870 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Feb 13 16:02:03.611663 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Feb 13 16:02:03.611788 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 16:02:03.612990 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Feb 13 16:02:03.613107 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 16:02:03.613957 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 16:02:03.615175 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Feb 13 16:02:03.616330 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Feb 13 16:02:03.616619 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Feb 13 16:02:03.618830 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
Feb 13 16:02:03.618988 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Feb 13 16:02:03.619212 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Feb 13 16:02:03.624211 kernel: ACPI: bus type drm_connector registered Feb 13 16:02:03.630743 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 16:02:03.630861 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 16:02:03.632989 systemd[1]: Starting systemd-sysusers.service - Create System Users... Feb 13 16:02:03.653020 systemd-journald[1162]: Time spent on flushing to /var/log/journal/ef1acec29c4f446baadf8a397a3cddad is 40.374ms for 1850 entries. Feb 13 16:02:03.653020 systemd-journald[1162]: System Journal (/var/log/journal/ef1acec29c4f446baadf8a397a3cddad) is 8M, max 584.8M, 576.8M free. Feb 13 16:02:03.699812 systemd-journald[1162]: Received client request to flush runtime journal. Feb 13 16:02:03.699856 kernel: loop0: detected capacity change from 0 to 2960 Feb 13 16:02:03.670908 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Feb 13 16:02:03.671717 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Feb 13 16:02:03.678567 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Feb 13 16:02:03.678905 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 16:02:03.701587 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Feb 13 16:02:03.750424 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Feb 13 16:02:03.755252 ignition[1211]: Ignition 2.20.0 Feb 13 16:02:03.756477 ignition[1211]: deleting config from guestinfo properties Feb 13 16:02:03.769795 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 16:02:03.771727 ignition[1211]: Successfully deleted config Feb 13 16:02:03.778498 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Feb 13 16:02:03.781688 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... Feb 13 16:02:03.783810 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config). Feb 13 16:02:03.790765 systemd[1]: Finished systemd-sysusers.service - Create System Users. Feb 13 16:02:03.797510 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 16:02:03.798612 udevadm[1237]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Feb 13 16:02:03.799450 kernel: loop1: detected capacity change from 0 to 218376 Feb 13 16:02:03.831722 systemd-tmpfiles[1240]: ACLs are not supported, ignoring. Feb 13 16:02:03.831737 systemd-tmpfiles[1240]: ACLs are not supported, ignoring. Feb 13 16:02:03.840834 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
Feb 13 16:02:03.879461 kernel: loop2: detected capacity change from 0 to 147912 Feb 13 16:02:03.952464 kernel: loop3: detected capacity change from 0 to 138176 Feb 13 16:02:04.039667 kernel: loop4: detected capacity change from 0 to 2960 Feb 13 16:02:04.099460 kernel: loop5: detected capacity change from 0 to 218376 Feb 13 16:02:04.128462 kernel: loop6: detected capacity change from 0 to 147912 Feb 13 16:02:04.172464 kernel: loop7: detected capacity change from 0 to 138176 Feb 13 16:02:04.206214 (sd-merge)[1247]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'. Feb 13 16:02:04.206510 (sd-merge)[1247]: Merged extensions into '/usr'. Feb 13 16:02:04.209059 systemd[1]: Reload requested from client PID 1210 ('systemd-sysext') (unit systemd-sysext.service)... Feb 13 16:02:04.209067 systemd[1]: Reloading... Feb 13 16:02:04.263468 zram_generator::config[1272]: No configuration found. Feb 13 16:02:04.349397 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:02:04.368845 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 16:02:04.411737 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Feb 13 16:02:04.411999 systemd[1]: Reloading finished in 202 ms. Feb 13 16:02:04.425436 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Feb 13 16:02:04.432567 systemd[1]: Starting ensure-sysext.service... Feb 13 16:02:04.436333 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 16:02:04.446130 systemd[1]: Reload requested from client PID 1330 ('systemctl') (unit ensure-sysext.service)... Feb 13 16:02:04.446140 systemd[1]: Reloading... Feb 13 16:02:04.476760 systemd-tmpfiles[1331]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Feb 13 16:02:04.477240 systemd-tmpfiles[1331]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Feb 13 16:02:04.478044 systemd-tmpfiles[1331]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Feb 13 16:02:04.478315 systemd-tmpfiles[1331]: ACLs are not supported, ignoring. Feb 13 16:02:04.478509 systemd-tmpfiles[1331]: ACLs are not supported, ignoring. Feb 13 16:02:04.494270 systemd-tmpfiles[1331]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 16:02:04.494483 systemd-tmpfiles[1331]: Skipping /boot Feb 13 16:02:04.501465 zram_generator::config[1356]: No configuration found. Feb 13 16:02:04.505738 systemd-tmpfiles[1331]: Detected autofs mount point /boot during canonicalization of boot. Feb 13 16:02:04.505748 systemd-tmpfiles[1331]: Skipping /boot Feb 13 16:02:04.581511 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:02:04.601745 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 16:02:04.646535 systemd[1]: Reloading finished in 200 ms. 
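Aside (rough illustration only): the (sd-merge) entries above show systemd-sysext picking up the containerd-flatcar, docker-flatcar, kubernetes, and oem-vmware extension images and merging them into /usr, which is also why the loop0-loop7 devices appear just before. The sketch below merely lists candidate .raw images in the usual extension directories; the directory list is an assumption based on the systemd-sysext documentation (only /etc/extensions is directly evidenced by the kubernetes.raw link in this log), and the real discovery and overlay logic live in systemd-sysext itself.

    # Rough illustration: enumerate candidate system-extension images the way
    # the /etc/extensions/kubernetes.raw link in this log suggests. Directory
    # list is an assumption (see systemd-sysext(8)); no merging is done here.
    from pathlib import Path

    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def candidate_extensions():
        found = {}
        for d in SEARCH_DIRS:
            directory = Path(d)
            if not directory.is_dir():
                continue
            for image in sorted(directory.glob("*.raw")):
                found.setdefault(image.stem, image)  # first directory wins (assumption)
        return found

    # print(candidate_extensions())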
Feb 13 16:02:04.660749 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Feb 13 16:02:04.677340 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 16:02:04.682069 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 16:02:04.686495 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Feb 13 16:02:04.689577 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Feb 13 16:02:04.697058 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 16:02:04.708033 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 16:02:04.711616 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Feb 13 16:02:04.715106 ldconfig[1205]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Feb 13 16:02:04.714302 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:04.716958 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 16:02:04.720672 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 16:02:04.723156 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 16:02:04.723343 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 16:02:04.723430 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 16:02:04.732628 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Feb 13 16:02:04.732741 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:04.734276 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Feb 13 16:02:04.734645 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 16:02:04.734754 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 16:02:04.740134 systemd-udevd[1425]: Using default interface naming scheme 'v255'. Feb 13 16:02:04.740482 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 16:02:04.741946 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Feb 13 16:02:04.747057 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 16:02:04.747363 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 16:02:04.748692 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 16:02:04.748801 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 16:02:04.751384 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:04.759706 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 16:02:04.761013 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
Feb 13 16:02:04.762262 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 16:02:04.765723 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 16:02:04.765899 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Feb 13 16:02:04.765970 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 16:02:04.766066 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 16:02:04.766673 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Feb 13 16:02:04.766792 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Feb 13 16:02:04.769402 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Feb 13 16:02:04.779670 systemd[1]: Starting systemd-update-done.service - Update is Completed... Feb 13 16:02:04.780047 systemd[1]: Finished ensure-sysext.service. Feb 13 16:02:04.780276 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Feb 13 16:02:04.780379 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Feb 13 16:02:04.780647 systemd[1]: modprobe@drm.service: Deactivated successfully. Feb 13 16:02:04.780740 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Feb 13 16:02:04.781812 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Feb 13 16:02:04.786805 augenrules[1464]: No rules Feb 13 16:02:04.789720 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Feb 13 16:02:04.790041 systemd[1]: Started systemd-userdbd.service - User Database Manager. Feb 13 16:02:04.790299 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 16:02:04.790646 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 16:02:04.790885 systemd[1]: modprobe@loop.service: Deactivated successfully. Feb 13 16:02:04.790990 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Feb 13 16:02:04.792186 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Feb 13 16:02:04.796070 systemd[1]: Finished systemd-update-done.service - Update is Completed. Feb 13 16:02:04.842615 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Feb 13 16:02:04.842826 systemd[1]: Reached target time-set.target - System Time Set. Feb 13 16:02:04.853500 systemd-resolved[1422]: Positive Trust Anchors: Feb 13 16:02:04.853508 systemd-resolved[1422]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 16:02:04.853531 systemd-resolved[1422]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 16:02:04.854239 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 16:02:04.858986 systemd-resolved[1422]: Defaulting to hostname 'linux'. Feb 13 16:02:04.861135 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 16:02:04.861310 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 16:02:04.861930 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 16:02:04.879677 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Feb 13 16:02:04.880037 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Feb 13 16:02:04.904695 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. Feb 13 16:02:04.923553 systemd-networkd[1483]: lo: Link UP Feb 13 16:02:04.923744 systemd-networkd[1483]: lo: Gained carrier Feb 13 16:02:04.924282 systemd-networkd[1483]: Enumeration completed Feb 13 16:02:04.924502 systemd[1]: Started systemd-networkd.service - Network Configuration. Feb 13 16:02:04.924769 systemd[1]: Reached target network.target - Network. Feb 13 16:02:04.932570 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Feb 13 16:02:04.935227 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Feb 13 16:02:04.942457 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Feb 13 16:02:04.949474 kernel: ACPI: button: Power Button [PWRF] Feb 13 16:02:04.957008 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Feb 13 16:02:04.965168 systemd-networkd[1483]: ens192: Configuring with /etc/systemd/network/00-vmware.network. Feb 13 16:02:04.967799 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 16:02:04.967943 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 16:02:04.968800 systemd-networkd[1483]: ens192: Link UP Feb 13 16:02:04.968970 systemd-networkd[1483]: ens192: Gained carrier Feb 13 16:02:04.972039 systemd-timesyncd[1462]: Network configuration changed, trying to establish connection. Feb 13 16:02:04.975453 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1489) Feb 13 16:02:05.031511 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled! Feb 13 16:02:05.039656 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 16:02:05.045582 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
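Note: ens192 is configured above from /etc/systemd/network/00-vmware.network. Its exact contents are not shown in this log; a typical DHCP-based .network file for that interface would look like the sketch below (contents are an assumption, only the path comes from the log).

# Illustrative .network file for ens192; systemd-networkd re-reads it on restart.
sudo tee /etc/systemd/network/00-vmware.network <<'EOF'
[Match]
Name=ens192

[Network]
DHCP=yes
EOF
sudo systemctl restart systemd-networkd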
Feb 13 16:02:05.059483 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3 Feb 13 16:02:05.065999 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Feb 13 16:02:05.076516 (udev-worker)[1485]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte. Feb 13 16:02:05.084648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 16:02:05.090464 kernel: mousedev: PS/2 mouse device common for all mice Feb 13 16:02:05.106754 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. Feb 13 16:02:05.114599 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... Feb 13 16:02:05.129005 lvm[1522]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 16:02:05.135326 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 16:02:05.158056 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. Feb 13 16:02:05.158690 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 16:02:05.158905 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 16:02:05.159117 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Feb 13 16:02:05.159291 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Feb 13 16:02:05.159562 systemd[1]: Started logrotate.timer - Daily rotation of log files. Feb 13 16:02:05.159759 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Feb 13 16:02:05.159921 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Feb 13 16:02:05.160079 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Feb 13 16:02:05.160137 systemd[1]: Reached target paths.target - Path Units. Feb 13 16:02:05.160275 systemd[1]: Reached target timers.target - Timer Units. Feb 13 16:02:05.161222 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Feb 13 16:02:05.162367 systemd[1]: Starting docker.socket - Docker Socket for the API... Feb 13 16:02:05.164059 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Feb 13 16:02:05.164279 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Feb 13 16:02:05.164403 systemd[1]: Reached target ssh-access.target - SSH Access Available. Feb 13 16:02:05.167615 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Feb 13 16:02:05.167984 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Feb 13 16:02:05.169015 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... Feb 13 16:02:05.169491 systemd[1]: Listening on docker.socket - Docker Socket for the API. Feb 13 16:02:05.169637 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 16:02:05.169735 systemd[1]: Reached target basic.target - Basic System. Feb 13 16:02:05.169862 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Feb 13 16:02:05.169877 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. 
Feb 13 16:02:05.171600 systemd[1]: Starting containerd.service - containerd container runtime... Feb 13 16:02:05.173626 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Feb 13 16:02:05.174560 lvm[1529]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Feb 13 16:02:05.178542 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Feb 13 16:02:05.180140 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Feb 13 16:02:05.180267 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Feb 13 16:02:05.183176 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Feb 13 16:02:05.184725 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Feb 13 16:02:05.186545 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Feb 13 16:02:05.189967 jq[1532]: false Feb 13 16:02:05.190536 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Feb 13 16:02:05.192280 systemd[1]: Starting systemd-logind.service - User Login Management... Feb 13 16:02:05.192878 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Feb 13 16:02:05.193278 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Feb 13 16:02:05.196008 systemd[1]: Starting update-engine.service - Update Engine... Feb 13 16:02:05.197373 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Feb 13 16:02:05.199529 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools... Feb 13 16:02:05.202667 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Feb 13 16:02:05.202791 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Feb 13 16:02:05.208650 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. Feb 13 16:02:05.215889 jq[1541]: true Feb 13 16:02:05.218658 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Feb 13 16:02:05.218823 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Feb 13 16:02:05.235693 (ntainerd)[1557]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Feb 13 16:02:05.237777 extend-filesystems[1533]: Found loop4 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found loop5 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found loop6 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found loop7 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda1 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda2 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda3 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found usr Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda4 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda6 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda7 Feb 13 16:02:05.238557 extend-filesystems[1533]: Found sda9 Feb 13 16:02:05.238557 extend-filesystems[1533]: Checking size of /dev/sda9 Feb 13 16:02:05.240691 jq[1552]: true Feb 13 16:02:05.250678 systemd[1]: motdgen.service: Deactivated successfully. Feb 13 16:02:05.250816 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Feb 13 16:02:05.257585 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools. Feb 13 16:02:05.261590 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware... Feb 13 16:02:05.269172 update_engine[1539]: I20250213 16:02:05.269017 1539 main.cc:92] Flatcar Update Engine starting Feb 13 16:02:05.269355 tar[1551]: linux-amd64/LICENSE Feb 13 16:02:05.269475 tar[1551]: linux-amd64/helm Feb 13 16:02:05.272283 extend-filesystems[1533]: Old size kept for /dev/sda9 Feb 13 16:02:05.272283 extend-filesystems[1533]: Found sr0 Feb 13 16:02:05.272431 systemd[1]: extend-filesystems.service: Deactivated successfully. Feb 13 16:02:05.272581 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Feb 13 16:02:05.275265 dbus-daemon[1531]: [system] SELinux support is enabled Feb 13 16:02:05.276770 systemd[1]: Started dbus.service - D-Bus System Message Bus. Feb 13 16:02:05.279329 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Feb 13 16:02:05.279350 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Feb 13 16:02:05.280532 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Feb 13 16:02:05.280543 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Feb 13 16:02:05.290584 update_engine[1539]: I20250213 16:02:05.287662 1539 update_check_scheduler.cc:74] Next update check in 7m53s Feb 13 16:02:05.292615 unknown[1570]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath Feb 13 16:02:05.292979 systemd[1]: Started update-engine.service - Update Engine. Feb 13 16:02:05.293998 unknown[1570]: Core dump limit set to -1 Feb 13 16:02:05.296583 systemd[1]: Started locksmithd.service - Cluster reboot manager. 
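Note: the update engine and locksmithd (the cluster reboot manager) start here. On Flatcar their behaviour is normally tuned through update.conf rather than the unit files; a hedged sketch, assuming the documented /etc/flatcar/update.conf path and standard strategy names:

# Illustrative only: switch locksmithd from plain reboots to etcd-coordinated reboots.
sudo tee -a /etc/flatcar/update.conf <<'EOF'
REBOOT_STRATEGY=etcd-lock
EOF
sudo systemctl restart locksmithd update-engine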
Feb 13 16:02:05.302101 systemd-logind[1538]: Watching system buttons on /dev/input/event1 (Power Button) Feb 13 16:02:05.302116 systemd-logind[1538]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Feb 13 16:02:05.304569 systemd-logind[1538]: New seat seat0. Feb 13 16:02:05.306357 systemd[1]: Started systemd-logind.service - User Login Management. Feb 13 16:02:05.328565 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware. Feb 13 16:02:05.342066 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1488) Feb 13 16:02:05.349533 bash[1591]: Updated "/home/core/.ssh/authorized_keys" Feb 13 16:02:05.350864 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Feb 13 16:02:05.352079 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. Feb 13 16:02:05.459462 locksmithd[1587]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Feb 13 16:02:05.520989 sshd_keygen[1563]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Feb 13 16:02:05.535361 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Feb 13 16:02:05.543656 systemd[1]: Starting issuegen.service - Generate /run/issue... Feb 13 16:02:05.548056 systemd[1]: issuegen.service: Deactivated successfully. Feb 13 16:02:05.548216 systemd[1]: Finished issuegen.service - Generate /run/issue. Feb 13 16:02:05.551579 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Feb 13 16:02:05.568174 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Feb 13 16:02:05.574739 systemd[1]: Started getty@tty1.service - Getty on tty1. Feb 13 16:02:05.577655 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Feb 13 16:02:05.578176 systemd[1]: Reached target getty.target - Login Prompts. Feb 13 16:02:05.600309 containerd[1557]: time="2025-02-13T16:02:05.600259871Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23 Feb 13 16:02:05.619415 containerd[1557]: time="2025-02-13T16:02:05.619385486Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622135536Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622153059Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622164292Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622253116Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622263046Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." 
type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622299068Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622307137Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622415498Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622423589Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622430655Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622813 containerd[1557]: time="2025-02-13T16:02:05.622435869Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622489415Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622602127Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622666317Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622673985Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622713296Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Feb 13 16:02:05.622978 containerd[1557]: time="2025-02-13T16:02:05.622740241Z" level=info msg="metadata content store policy set" policy=shared Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.660896896Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.660936871Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.660947571Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.660957470Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.660966277Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." 
type=io.containerd.runtime.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661045818Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661171960Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661232364Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661241718Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661250500Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661257917Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661265778Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661272247Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661594 containerd[1557]: time="2025-02-13T16:02:05.661280026Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661288059Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661294875Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661301939Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661308612Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661321465Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661329731Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661336291Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661343220Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661349592Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661357243Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." 
type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661364321Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661371162Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661379555Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.661823 containerd[1557]: time="2025-02-13T16:02:05.661387466Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661393959Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661400755Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661407560Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661415215Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661426419Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.662000 containerd[1557]: time="2025-02-13T16:02:05.661433635Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.661439357Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662433338Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662490420Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662500214Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662507877Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662513666Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662520972Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1 Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662526639Z" level=info msg="NRI interface is disabled by configuration." Feb 13 16:02:05.663454 containerd[1557]: time="2025-02-13T16:02:05.662532736Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Feb 13 16:02:05.663594 containerd[1557]: time="2025-02-13T16:02:05.662693825Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 16:02:05.663594 containerd[1557]: time="2025-02-13T16:02:05.662724965Z" level=info msg="Connect containerd service" Feb 13 16:02:05.663594 containerd[1557]: time="2025-02-13T16:02:05.662741315Z" level=info msg="using legacy CRI server" Feb 13 16:02:05.663594 containerd[1557]: time="2025-02-13T16:02:05.662745558Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 16:02:05.663594 containerd[1557]: time="2025-02-13T16:02:05.662807135Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 16:02:05.664203 containerd[1557]: time="2025-02-13T16:02:05.664190957Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 16:02:05.664451 
containerd[1557]: time="2025-02-13T16:02:05.664427746Z" level=info msg="Start subscribing containerd event" Feb 13 16:02:05.665378 containerd[1557]: time="2025-02-13T16:02:05.664994252Z" level=info msg="Start recovering state" Feb 13 16:02:05.665378 containerd[1557]: time="2025-02-13T16:02:05.665061847Z" level=info msg="Start event monitor" Feb 13 16:02:05.665378 containerd[1557]: time="2025-02-13T16:02:05.665074642Z" level=info msg="Start snapshots syncer" Feb 13 16:02:05.665378 containerd[1557]: time="2025-02-13T16:02:05.665080751Z" level=info msg="Start cni network conf syncer for default" Feb 13 16:02:05.665378 containerd[1557]: time="2025-02-13T16:02:05.665085067Z" level=info msg="Start streaming server" Feb 13 16:02:05.666024 containerd[1557]: time="2025-02-13T16:02:05.665392100Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 16:02:05.666024 containerd[1557]: time="2025-02-13T16:02:05.665419685Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 16:02:05.666024 containerd[1557]: time="2025-02-13T16:02:05.665459119Z" level=info msg="containerd successfully booted in 0.065666s" Feb 13 16:02:05.665529 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 16:02:05.778487 tar[1551]: linux-amd64/README.md Feb 13 16:02:05.789431 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Feb 13 16:02:06.548593 systemd-networkd[1483]: ens192: Gained IPv6LL Feb 13 16:02:06.548935 systemd-timesyncd[1462]: Network configuration changed, trying to establish connection. Feb 13 16:02:06.550544 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Feb 13 16:02:06.550949 systemd[1]: Reached target network-online.target - Network is Online. Feb 13 16:02:06.556612 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Feb 13 16:02:06.557899 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:02:06.559579 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Feb 13 16:02:06.573315 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Feb 13 16:02:06.589479 systemd[1]: coreos-metadata.service: Deactivated successfully. Feb 13 16:02:06.589687 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Feb 13 16:02:06.590295 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Feb 13 16:02:07.846258 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:07.846751 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 16:02:07.847233 systemd[1]: Startup finished in 985ms (kernel) + 6.328s (initrd) + 4.924s (userspace) = 12.237s. Feb 13 16:02:07.852788 (kubelet)[1709]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 16:02:08.128520 login[1629]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 16:02:08.130085 login[1633]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 16:02:08.134505 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 16:02:08.142595 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 16:02:08.147953 systemd-logind[1538]: New session 1 of user core. Feb 13 16:02:08.150428 systemd-logind[1538]: New session 2 of user core. 
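Note: the CRI config dump a few entries back shows the runc runtime with SystemdCgroup:true, the overlayfs snapshotter, and pause:3.8 as the sandbox image. As a rough illustration (not Flatcar's shipped configuration file), the same runc option expressed in a containerd 1.7 config.toml would be:

# Sketch only; the path and restart step assume a default containerd setup.
sudo mkdir -p /etc/containerd
sudo tee /etc/containerd/config.toml <<'EOF'
version = 2
[plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
  runtime_type = "io.containerd.runc.v2"
  [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
    SystemdCgroup = true
EOF
sudo systemctl restart containerd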
Feb 13 16:02:08.153138 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 16:02:08.161752 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 16:02:08.182043 (systemd)[1716]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 16:02:08.183463 systemd-logind[1538]: New session c1 of user core. Feb 13 16:02:08.245732 systemd-timesyncd[1462]: Network configuration changed, trying to establish connection. Feb 13 16:02:08.359482 systemd[1716]: Queued start job for default target default.target. Feb 13 16:02:08.368512 systemd[1716]: Created slice app.slice - User Application Slice. Feb 13 16:02:08.368599 systemd[1716]: Reached target paths.target - Paths. Feb 13 16:02:08.368733 systemd[1716]: Reached target timers.target - Timers. Feb 13 16:02:08.369793 systemd[1716]: Starting dbus.socket - D-Bus User Message Bus Socket... Feb 13 16:02:08.377880 systemd[1716]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 16:02:08.377914 systemd[1716]: Reached target sockets.target - Sockets. Feb 13 16:02:08.377974 systemd[1716]: Reached target basic.target - Basic System. Feb 13 16:02:08.378022 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 16:02:08.378847 systemd[1716]: Reached target default.target - Main User Target. Feb 13 16:02:08.378872 systemd[1716]: Startup finished in 191ms. Feb 13 16:02:08.387582 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 16:02:08.388254 systemd[1]: Started session-2.scope - Session 2 of User core. Feb 13 16:02:09.079728 kubelet[1709]: E0213 16:02:09.079667 1709 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 16:02:09.080699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 16:02:09.080801 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 16:02:09.080999 systemd[1]: kubelet.service: Consumed 670ms CPU time, 253M memory peak. Feb 13 16:02:19.331287 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 16:02:19.339609 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:02:19.640780 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:19.643320 (kubelet)[1760]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 16:02:19.736268 kubelet[1760]: E0213 16:02:19.736235 1760 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 16:02:19.738778 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 16:02:19.738867 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 16:02:19.739247 systemd[1]: kubelet.service: Consumed 99ms CPU time, 102.6M memory peak. Feb 13 16:02:29.989401 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
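Note: the kubelet crash-loops above because /var/lib/kubelet/config.yaml does not exist yet. That file is normally written by `kubeadm init` or `kubeadm join`, so these failures before bootstrap are expected. Purely as an illustration, a minimal hand-written KubeletConfiguration would look like:

# Illustrative stand-in for the file kubeadm would generate; not a recommended setup.
sudo mkdir -p /var/lib/kubelet
sudo tee /var/lib/kubelet/config.yaml <<'EOF'
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd
EOF
sudo systemctl restart kubelet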
Feb 13 16:02:29.997562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:02:30.305034 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:02:30.307474 (kubelet)[1775]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 16:02:30.359653 kubelet[1775]: E0213 16:02:30.359611 1775 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 16:02:30.361168 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 16:02:30.361315 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 16:02:30.361652 systemd[1]: kubelet.service: Consumed 107ms CPU time, 103.8M memory peak. Feb 13 16:03:52.468862 systemd-resolved[1422]: Clock change detected. Flushing caches. Feb 13 16:03:52.468923 systemd-timesyncd[1462]: Contacted time server 172.245.210.108:123 (2.flatcar.pool.ntp.org). Feb 13 16:03:52.468960 systemd-timesyncd[1462]: Initial clock synchronization to Thu 2025-02-13 16:03:52.468813 UTC. Feb 13 16:03:54.494945 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Feb 13 16:03:54.502796 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:03:54.686462 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:03:54.689106 (kubelet)[1790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 16:03:54.716237 kubelet[1790]: E0213 16:03:54.716204 1790 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 16:03:54.717258 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 16:03:54.717337 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 16:03:54.717737 systemd[1]: kubelet.service: Consumed 83ms CPU time, 103.8M memory peak. Feb 13 16:03:59.331422 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 16:03:59.332692 systemd[1]: Started sshd@0-139.178.70.109:22-147.75.109.163:35370.service - OpenSSH per-connection server daemon (147.75.109.163:35370). Feb 13 16:03:59.371291 sshd[1798]: Accepted publickey for core from 147.75.109.163 port 35370 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.372027 sshd-session[1798]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.375500 systemd-logind[1538]: New session 3 of user core. Feb 13 16:03:59.380735 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 16:03:59.440821 systemd[1]: Started sshd@1-139.178.70.109:22-147.75.109.163:35378.service - OpenSSH per-connection server daemon (147.75.109.163:35378). 
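Note: the clock jump and resolver cache flush above come from systemd-timesyncd reaching 2.flatcar.pool.ntp.org for the first time. If fixed NTP servers are preferred, timesyncd takes them from timesyncd.conf; a small sketch with placeholder server names:

# Placeholder servers; any reachable NTP pool or internal server works here.
sudo mkdir -p /etc/systemd/timesyncd.conf.d
sudo tee /etc/systemd/timesyncd.conf.d/10-ntp.conf <<'EOF'
[Time]
NTP=0.pool.ntp.org 1.pool.ntp.org
EOF
sudo systemctl restart systemd-timesyncd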
Feb 13 16:03:59.469920 sshd[1803]: Accepted publickey for core from 147.75.109.163 port 35378 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.470637 sshd-session[1803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.474069 systemd-logind[1538]: New session 4 of user core. Feb 13 16:03:59.483743 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 16:03:59.532880 sshd[1805]: Connection closed by 147.75.109.163 port 35378 Feb 13 16:03:59.533853 sshd-session[1803]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:59.543576 systemd[1]: sshd@1-139.178.70.109:22-147.75.109.163:35378.service: Deactivated successfully. Feb 13 16:03:59.544883 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 16:03:59.545495 systemd-logind[1538]: Session 4 logged out. Waiting for processes to exit. Feb 13 16:03:59.548944 systemd[1]: Started sshd@2-139.178.70.109:22-147.75.109.163:35394.service - OpenSSH per-connection server daemon (147.75.109.163:35394). Feb 13 16:03:59.550764 systemd-logind[1538]: Removed session 4. Feb 13 16:03:59.580475 sshd[1810]: Accepted publickey for core from 147.75.109.163 port 35394 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.581601 sshd-session[1810]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.584169 systemd-logind[1538]: New session 5 of user core. Feb 13 16:03:59.595762 systemd[1]: Started session-5.scope - Session 5 of User core. Feb 13 16:03:59.641426 sshd[1813]: Connection closed by 147.75.109.163 port 35394 Feb 13 16:03:59.642178 sshd-session[1810]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:59.651441 systemd[1]: Started sshd@3-139.178.70.109:22-147.75.109.163:35400.service - OpenSSH per-connection server daemon (147.75.109.163:35400). Feb 13 16:03:59.651784 systemd[1]: sshd@2-139.178.70.109:22-147.75.109.163:35394.service: Deactivated successfully. Feb 13 16:03:59.652692 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 16:03:59.653681 systemd-logind[1538]: Session 5 logged out. Waiting for processes to exit. Feb 13 16:03:59.655079 systemd-logind[1538]: Removed session 5. Feb 13 16:03:59.684034 sshd[1816]: Accepted publickey for core from 147.75.109.163 port 35400 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.684958 sshd-session[1816]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.688673 systemd-logind[1538]: New session 6 of user core. Feb 13 16:03:59.695802 systemd[1]: Started session-6.scope - Session 6 of User core. Feb 13 16:03:59.743513 sshd[1821]: Connection closed by 147.75.109.163 port 35400 Feb 13 16:03:59.743839 sshd-session[1816]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:59.755450 systemd[1]: sshd@3-139.178.70.109:22-147.75.109.163:35400.service: Deactivated successfully. Feb 13 16:03:59.756651 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 16:03:59.757264 systemd-logind[1538]: Session 6 logged out. Waiting for processes to exit. Feb 13 16:03:59.761937 systemd[1]: Started sshd@4-139.178.70.109:22-147.75.109.163:35414.service - OpenSSH per-connection server daemon (147.75.109.163:35414). Feb 13 16:03:59.763943 systemd-logind[1538]: Removed session 6. 
Feb 13 16:03:59.794003 sshd[1826]: Accepted publickey for core from 147.75.109.163 port 35414 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.794629 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.797104 systemd-logind[1538]: New session 7 of user core. Feb 13 16:03:59.804714 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 16:03:59.861024 sudo[1830]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 16:03:59.861260 sudo[1830]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 16:03:59.874311 sudo[1830]: pam_unix(sudo:session): session closed for user root Feb 13 16:03:59.875091 sshd[1829]: Connection closed by 147.75.109.163 port 35414 Feb 13 16:03:59.876099 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Feb 13 16:03:59.882559 systemd[1]: Started sshd@5-139.178.70.109:22-147.75.109.163:35420.service - OpenSSH per-connection server daemon (147.75.109.163:35420). Feb 13 16:03:59.885019 systemd[1]: sshd@4-139.178.70.109:22-147.75.109.163:35414.service: Deactivated successfully. Feb 13 16:03:59.886021 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 16:03:59.887705 systemd-logind[1538]: Session 7 logged out. Waiting for processes to exit. Feb 13 16:03:59.888602 systemd-logind[1538]: Removed session 7. Feb 13 16:03:59.919884 sshd[1833]: Accepted publickey for core from 147.75.109.163 port 35420 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:03:59.920795 sshd-session[1833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:03:59.924508 systemd-logind[1538]: New session 8 of user core. Feb 13 16:03:59.931740 systemd[1]: Started session-8.scope - Session 8 of User core. Feb 13 16:03:59.981769 sudo[1840]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 16:03:59.981967 sudo[1840]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 16:03:59.984599 sudo[1840]: pam_unix(sudo:session): session closed for user root Feb 13 16:03:59.988382 sudo[1839]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 16:03:59.988766 sudo[1839]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 16:03:59.998831 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 16:04:00.013299 augenrules[1862]: No rules Feb 13 16:04:00.013919 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 16:04:00.014121 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 16:04:00.014783 sudo[1839]: pam_unix(sudo:session): session closed for user root Feb 13 16:04:00.015480 sshd[1838]: Connection closed by 147.75.109.163 port 35420 Feb 13 16:04:00.016134 sshd-session[1833]: pam_unix(sshd:session): session closed for user core Feb 13 16:04:00.020598 systemd[1]: sshd@5-139.178.70.109:22-147.75.109.163:35420.service: Deactivated successfully. Feb 13 16:04:00.021885 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 16:04:00.022799 systemd-logind[1538]: Session 8 logged out. Waiting for processes to exit. Feb 13 16:04:00.028103 systemd[1]: Started sshd@6-139.178.70.109:22-147.75.109.163:35430.service - OpenSSH per-connection server daemon (147.75.109.163:35430). Feb 13 16:04:00.029063 systemd-logind[1538]: Removed session 8. 
Feb 13 16:04:00.056852 sshd[1870]: Accepted publickey for core from 147.75.109.163 port 35430 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:04:00.057462 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:04:00.061129 systemd-logind[1538]: New session 9 of user core. Feb 13 16:04:00.070732 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 16:04:00.119475 sudo[1874]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 16:04:00.119701 sudo[1874]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 16:04:00.410860 (dockerd)[1890]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Feb 13 16:04:00.411153 systemd[1]: Starting docker.service - Docker Application Container Engine... Feb 13 16:04:00.674547 dockerd[1890]: time="2025-02-13T16:04:00.674474575Z" level=info msg="Starting up" Feb 13 16:04:00.727399 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1439052969-merged.mount: Deactivated successfully. Feb 13 16:04:00.745509 dockerd[1890]: time="2025-02-13T16:04:00.745488511Z" level=info msg="Loading containers: start." Feb 13 16:04:00.852668 kernel: Initializing XFRM netlink socket Feb 13 16:04:00.904554 systemd-networkd[1483]: docker0: Link UP Feb 13 16:04:00.926507 dockerd[1890]: time="2025-02-13T16:04:00.926447090Z" level=info msg="Loading containers: done." Feb 13 16:04:00.935727 dockerd[1890]: time="2025-02-13T16:04:00.935363125Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 16:04:00.935727 dockerd[1890]: time="2025-02-13T16:04:00.935423549Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Feb 13 16:04:00.935727 dockerd[1890]: time="2025-02-13T16:04:00.935486276Z" level=info msg="Daemon has completed initialization" Feb 13 16:04:00.950996 dockerd[1890]: time="2025-02-13T16:04:00.950974307Z" level=info msg="API listen on /run/docker.sock" Feb 13 16:04:00.951020 systemd[1]: Started docker.service - Docker Application Container Engine. Feb 13 16:04:01.507561 containerd[1557]: time="2025-02-13T16:04:01.507534453Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.2\"" Feb 13 16:04:01.725093 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1424541460-merged.mount: Deactivated successfully. Feb 13 16:04:02.022152 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1638548058.mount: Deactivated successfully. 
Feb 13 16:04:02.920674 containerd[1557]: time="2025-02-13T16:04:02.920626112Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:02.921470 containerd[1557]: time="2025-02-13T16:04:02.921451039Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.2: active requests=0, bytes read=28673931" Feb 13 16:04:02.921609 containerd[1557]: time="2025-02-13T16:04:02.921596886Z" level=info msg="ImageCreate event name:\"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:02.923152 containerd[1557]: time="2025-02-13T16:04:02.923132213Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:c47449f3e751588ea0cb74e325e0f83db335a415f4f4c7fb147375dd6c84757f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:02.923790 containerd[1557]: time="2025-02-13T16:04:02.923772132Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.2\" with image id \"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.2\", repo digest \"registry.k8s.io/kube-apiserver@sha256:c47449f3e751588ea0cb74e325e0f83db335a415f4f4c7fb147375dd6c84757f\", size \"28670731\" in 1.41621297s" Feb 13 16:04:02.923829 containerd[1557]: time="2025-02-13T16:04:02.923792344Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.2\" returns image reference \"sha256:85b7a174738baecbc53029b7913cd430a2060e0cbdb5f56c7957d32ff7f241ef\"" Feb 13 16:04:02.924190 containerd[1557]: time="2025-02-13T16:04:02.924166726Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.2\"" Feb 13 16:04:04.325712 containerd[1557]: time="2025-02-13T16:04:04.325083602Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:04.326183 containerd[1557]: time="2025-02-13T16:04:04.326059151Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.2: active requests=0, bytes read=24771784" Feb 13 16:04:04.326764 containerd[1557]: time="2025-02-13T16:04:04.326522020Z" level=info msg="ImageCreate event name:\"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:04.328405 containerd[1557]: time="2025-02-13T16:04:04.328373573Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:399aa50f4d1361c59dc458e634506d02de32613d03a9a614a21058741162ef90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:04.329209 containerd[1557]: time="2025-02-13T16:04:04.329137389Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.2\" with image id \"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.2\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:399aa50f4d1361c59dc458e634506d02de32613d03a9a614a21058741162ef90\", size \"26259392\" in 1.404907186s" Feb 13 16:04:04.329209 containerd[1557]: time="2025-02-13T16:04:04.329156689Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.2\" returns image reference \"sha256:b6a454c5a800d201daacead6ff195ec6049fe6dc086621b0670bca912efaf389\"" Feb 13 16:04:04.329569 
containerd[1557]: time="2025-02-13T16:04:04.329545273Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.2\"" Feb 13 16:04:04.496045 update_engine[1539]: I20250213 16:04:04.495668 1539 update_attempter.cc:509] Updating boot flags... Feb 13 16:04:04.519652 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2150) Feb 13 16:04:04.563574 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2153) Feb 13 16:04:04.601444 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2153) Feb 13 16:04:04.967620 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Feb 13 16:04:04.973867 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:04:05.031417 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:04:05.033732 (kubelet)[2170]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 16:04:05.056401 kubelet[2170]: E0213 16:04:05.056338 2170 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 16:04:05.057626 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 16:04:05.057720 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 16:04:05.058014 systemd[1]: kubelet.service: Consumed 78ms CPU time, 107.6M memory peak. Feb 13 16:04:05.449723 containerd[1557]: time="2025-02-13T16:04:05.449036797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:05.455073 containerd[1557]: time="2025-02-13T16:04:05.455040342Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.2: active requests=0, bytes read=19170276" Feb 13 16:04:05.460727 containerd[1557]: time="2025-02-13T16:04:05.460691363Z" level=info msg="ImageCreate event name:\"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:05.467428 containerd[1557]: time="2025-02-13T16:04:05.467382637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:45710d74cfd5aa10a001d0cf81747b77c28617444ffee0503d12f1dcd7450f76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:05.468847 containerd[1557]: time="2025-02-13T16:04:05.468756761Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.2\" with image id \"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.2\", repo digest \"registry.k8s.io/kube-scheduler@sha256:45710d74cfd5aa10a001d0cf81747b77c28617444ffee0503d12f1dcd7450f76\", size \"20657902\" in 1.13914174s" Feb 13 16:04:05.468847 containerd[1557]: time="2025-02-13T16:04:05.468779137Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.2\" returns image reference \"sha256:d8e673e7c9983f1f53569a9d2ba786c8abb42e3f744f77dc97a595f3caf9435d\"" Feb 13 16:04:05.469297 containerd[1557]: time="2025-02-13T16:04:05.469078866Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.32.2\"" Feb 13 16:04:06.810753 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount945414097.mount: Deactivated successfully. Feb 13 16:04:07.090298 containerd[1557]: time="2025-02-13T16:04:07.090232355Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:07.090672 containerd[1557]: time="2025-02-13T16:04:07.090512563Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.2: active requests=0, bytes read=30908839" Feb 13 16:04:07.090944 containerd[1557]: time="2025-02-13T16:04:07.090932251Z" level=info msg="ImageCreate event name:\"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:07.091974 containerd[1557]: time="2025-02-13T16:04:07.091957037Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:83c025f0faa6799fab6645102a98138e39a9a7db2be3bc792c79d72659b1805d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:07.092371 containerd[1557]: time="2025-02-13T16:04:07.092354590Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.2\" with image id \"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\", repo tag \"registry.k8s.io/kube-proxy:v1.32.2\", repo digest \"registry.k8s.io/kube-proxy@sha256:83c025f0faa6799fab6645102a98138e39a9a7db2be3bc792c79d72659b1805d\", size \"30907858\" in 1.623259071s" Feb 13 16:04:07.092402 containerd[1557]: time="2025-02-13T16:04:07.092371912Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.2\" returns image reference \"sha256:f1332858868e1c6a905123b21e2e322ab45a5b99a3532e68ff49a87c2266ebc5\"" Feb 13 16:04:07.092716 containerd[1557]: time="2025-02-13T16:04:07.092701942Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Feb 13 16:04:07.628372 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount236269815.mount: Deactivated successfully. 
Feb 13 16:04:08.514296 containerd[1557]: time="2025-02-13T16:04:08.514257259Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:08.515255 containerd[1557]: time="2025-02-13T16:04:08.514925198Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565241" Feb 13 16:04:08.515255 containerd[1557]: time="2025-02-13T16:04:08.515029770Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:08.516710 containerd[1557]: time="2025-02-13T16:04:08.516697027Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:08.517402 containerd[1557]: time="2025-02-13T16:04:08.517384664Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.424665446s" Feb 13 16:04:08.517430 containerd[1557]: time="2025-02-13T16:04:08.517403423Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Feb 13 16:04:08.517670 containerd[1557]: time="2025-02-13T16:04:08.517658548Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Feb 13 16:04:09.067079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount176375334.mount: Deactivated successfully. 
Feb 13 16:04:09.068573 containerd[1557]: time="2025-02-13T16:04:09.068540294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:09.069025 containerd[1557]: time="2025-02-13T16:04:09.068994442Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" Feb 13 16:04:09.069692 containerd[1557]: time="2025-02-13T16:04:09.069126445Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:09.070676 containerd[1557]: time="2025-02-13T16:04:09.070661634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:09.071238 containerd[1557]: time="2025-02-13T16:04:09.071224381Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 553.551619ms" Feb 13 16:04:09.071296 containerd[1557]: time="2025-02-13T16:04:09.071281132Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Feb 13 16:04:09.071711 containerd[1557]: time="2025-02-13T16:04:09.071700320Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Feb 13 16:04:09.711139 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount519024180.mount: Deactivated successfully. Feb 13 16:04:12.315038 containerd[1557]: time="2025-02-13T16:04:12.315005957Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:12.316024 containerd[1557]: time="2025-02-13T16:04:12.315997033Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551320" Feb 13 16:04:12.316482 containerd[1557]: time="2025-02-13T16:04:12.316272547Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:12.318053 containerd[1557]: time="2025-02-13T16:04:12.318033460Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:12.318820 containerd[1557]: time="2025-02-13T16:04:12.318803690Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.247051015s" Feb 13 16:04:12.318859 containerd[1557]: time="2025-02-13T16:04:12.318822496Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Feb 13 16:04:14.263509 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
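[Editorial note, not part of the log] Each of the containerd "Pulled image" entries above reports the bytes read and the elapsed time (e.g. the kube-proxy pull: 30,907,858 bytes in 1.623259071s). As a rough back-of-the-envelope illustration only, the following Python sketch extracts those two figures from a line in this format and derives an effective pull throughput; the sample values are copied from the kube-proxy entry above.

    import re

    # Matches the containerd 'Pulled image ... size \"<bytes>\" in <duration>' entries
    # seen above. Durations are logged as Go-style strings such as "1.623259071s"
    # or "553.551619ms".
    PULLED_RE = re.compile(r'size \\?"(?P<size>\d+)\\?" in (?P<dur>[\d.]+)(?P<unit>ms|s)')

    def throughput_mib_s(line: str) -> float:
        """Return the effective pull throughput in MiB/s for one 'Pulled image' line."""
        m = PULLED_RE.search(line)
        if not m:
            raise ValueError("not a 'Pulled image' line")
        seconds = float(m.group("dur")) * (0.001 if m.group("unit") == "ms" else 1.0)
        return int(m.group("size")) / (1024 * 1024) / seconds

    # Figures copied from the kube-proxy pull above: 30,907,858 bytes in 1.623259071s.
    sample = 'Pulled image "registry.k8s.io/kube-proxy:v1.32.2" ... size \\"30907858\\" in 1.623259071s'
    print(f"{throughput_mib_s(sample):.1f} MiB/s")  # roughly 18 MiB/s

Note that the reported size is the repo size and the duration covers the whole pull, so this is only an approximation of network throughput.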
Feb 13 16:04:14.263739 systemd[1]: kubelet.service: Consumed 78ms CPU time, 107.6M memory peak. Feb 13 16:04:14.267771 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:04:14.289022 systemd[1]: Reload requested from client PID 2326 ('systemctl') (unit session-9.scope)... Feb 13 16:04:14.289031 systemd[1]: Reloading... Feb 13 16:04:14.366672 zram_generator::config[2370]: No configuration found. Feb 13 16:04:14.423601 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:04:14.442407 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 16:04:14.505793 systemd[1]: Reloading finished in 216 ms. Feb 13 16:04:14.543983 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Feb 13 16:04:14.544061 systemd[1]: kubelet.service: Failed with result 'signal'. Feb 13 16:04:14.544358 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:04:14.544404 systemd[1]: kubelet.service: Consumed 41ms CPU time, 72.4M memory peak. Feb 13 16:04:14.549937 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:04:14.911625 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:04:14.918853 (kubelet)[2438]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 16:04:14.965595 kubelet[2438]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 16:04:14.965595 kubelet[2438]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Feb 13 16:04:14.965595 kubelet[2438]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
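[Editorial note, not part of the log] The kubelet warnings above say that --container-runtime-endpoint and --volume-plugin-dir should be set via the file passed to --config rather than as flags. A minimal sketch of what that would look like, assuming the corresponding upstream KubeletConfiguration (kubelet.config.k8s.io/v1beta1) field names: the runtime-endpoint value below is a placeholder (the actual flag values are not shown in this log), while the plugin directory matches the path the kubelet recreates a few entries further down.

    # Minimal sketch of moving the two deprecated flags noted above into the kubelet
    # config file. Field names follow upstream KubeletConfiguration v1beta1; the
    # endpoint value is a placeholder, not taken from this host's unit files.
    KUBELET_CONFIG_SKETCH = """\
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
    volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/
    """

    print(KUBELET_CONFIG_SKETCH)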
Feb 13 16:04:14.965881 kubelet[2438]: I0213 16:04:14.965631 2438 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 16:04:15.286134 kubelet[2438]: I0213 16:04:15.286113 2438 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Feb 13 16:04:15.286134 kubelet[2438]: I0213 16:04:15.286134 2438 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 16:04:15.286311 kubelet[2438]: I0213 16:04:15.286297 2438 server.go:954] "Client rotation is on, will bootstrap in background" Feb 13 16:04:15.311088 kubelet[2438]: E0213 16:04:15.311063 2438 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://139.178.70.109:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:15.311681 kubelet[2438]: I0213 16:04:15.311668 2438 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 16:04:15.323060 kubelet[2438]: E0213 16:04:15.323038 2438 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Feb 13 16:04:15.323060 kubelet[2438]: I0213 16:04:15.323054 2438 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Feb 13 16:04:15.325712 kubelet[2438]: I0213 16:04:15.325700 2438 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 16:04:15.328408 kubelet[2438]: I0213 16:04:15.328386 2438 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 16:04:15.328505 kubelet[2438]: I0213 16:04:15.328408 2438 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 13 16:04:15.329796 kubelet[2438]: I0213 16:04:15.329780 2438 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 16:04:15.329796 kubelet[2438]: I0213 16:04:15.329793 2438 container_manager_linux.go:304] "Creating device plugin manager" Feb 13 16:04:15.329880 kubelet[2438]: I0213 16:04:15.329866 2438 state_mem.go:36] "Initialized new in-memory state store" Feb 13 16:04:15.333179 kubelet[2438]: I0213 16:04:15.333169 2438 kubelet.go:446] "Attempting to sync node with API server" Feb 13 16:04:15.333205 kubelet[2438]: I0213 16:04:15.333181 2438 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 16:04:15.333205 kubelet[2438]: I0213 16:04:15.333191 2438 kubelet.go:352] "Adding apiserver pod source" Feb 13 16:04:15.333205 kubelet[2438]: I0213 16:04:15.333196 2438 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 16:04:15.336976 kubelet[2438]: I0213 16:04:15.336880 2438 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 16:04:15.339585 kubelet[2438]: I0213 16:04:15.339496 2438 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 16:04:15.340606 kubelet[2438]: W0213 16:04:15.340012 2438 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Feb 13 16:04:15.340606 kubelet[2438]: I0213 16:04:15.340362 2438 watchdog_linux.go:99] "Systemd watchdog is not enabled" Feb 13 16:04:15.340606 kubelet[2438]: I0213 16:04:15.340379 2438 server.go:1287] "Started kubelet" Feb 13 16:04:15.340606 kubelet[2438]: W0213 16:04:15.340459 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:15.340606 kubelet[2438]: E0213 16:04:15.340496 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:15.340606 kubelet[2438]: W0213 16:04:15.340538 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:15.340606 kubelet[2438]: E0213 16:04:15.340556 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:15.347270 kubelet[2438]: I0213 16:04:15.347052 2438 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 16:04:15.347270 kubelet[2438]: I0213 16:04:15.347251 2438 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 16:04:15.349357 kubelet[2438]: I0213 16:04:15.348894 2438 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 16:04:15.351863 kubelet[2438]: E0213 16:04:15.348070 2438 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.109:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.109:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1823d011a79ecc23 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 16:04:15.340366883 +0000 UTC m=+0.419206910,LastTimestamp:2025-02-13 16:04:15.340366883 +0000 UTC m=+0.419206910,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Feb 13 16:04:15.351863 kubelet[2438]: I0213 16:04:15.351496 2438 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 16:04:15.353956 kubelet[2438]: I0213 16:04:15.353940 2438 server.go:490] "Adding debug handlers to kubelet server" Feb 13 16:04:15.354669 kubelet[2438]: I0213 16:04:15.354591 2438 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 16:04:15.355694 kubelet[2438]: 
I0213 16:04:15.355682 2438 volume_manager.go:297] "Starting Kubelet Volume Manager" Feb 13 16:04:15.355836 kubelet[2438]: E0213 16:04:15.355823 2438 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 16:04:15.357315 kubelet[2438]: E0213 16:04:15.357212 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="200ms" Feb 13 16:04:15.357466 kubelet[2438]: I0213 16:04:15.357449 2438 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 16:04:15.359453 kubelet[2438]: I0213 16:04:15.359381 2438 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 16:04:15.359635 kubelet[2438]: I0213 16:04:15.359624 2438 reconciler.go:26] "Reconciler: start to sync state" Feb 13 16:04:15.359893 kubelet[2438]: W0213 16:04:15.359867 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:15.359933 kubelet[2438]: E0213 16:04:15.359909 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:15.364695 kubelet[2438]: I0213 16:04:15.362082 2438 factory.go:221] Registration of the containerd container factory successfully Feb 13 16:04:15.364695 kubelet[2438]: I0213 16:04:15.362091 2438 factory.go:221] Registration of the systemd container factory successfully Feb 13 16:04:15.369253 kubelet[2438]: I0213 16:04:15.369226 2438 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 16:04:15.369891 kubelet[2438]: I0213 16:04:15.369879 2438 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 13 16:04:15.369891 kubelet[2438]: I0213 16:04:15.369891 2438 status_manager.go:227] "Starting to sync pod status with apiserver" Feb 13 16:04:15.369944 kubelet[2438]: I0213 16:04:15.369903 2438 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
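[Editorial note, not part of the log] The reflector errors in this stretch embed URL-encoded field selectors (for example spec.clusterIP%21%3DNone on the Service list and metadata.name%3Dlocalhost on the Node list). A one-line decoding with the standard library makes clear what the kubelet's informers are actually asking the API server for:

    from urllib.parse import unquote

    # Field selectors exactly as they appear in the reflector URLs logged here.
    for encoded in ("spec.clusterIP%21%3DNone", "metadata.name%3Dlocalhost"):
        print(encoded, "->", unquote(encoded))
    # spec.clusterIP%21%3DNone -> spec.clusterIP!=None
    # metadata.name%3Dlocalhost -> metadata.name=localhost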
Feb 13 16:04:15.369944 kubelet[2438]: I0213 16:04:15.369907 2438 kubelet.go:2388] "Starting kubelet main sync loop" Feb 13 16:04:15.369944 kubelet[2438]: E0213 16:04:15.369933 2438 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 16:04:15.373302 kubelet[2438]: W0213 16:04:15.373274 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:15.373344 kubelet[2438]: E0213 16:04:15.373306 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:15.380226 kubelet[2438]: I0213 16:04:15.380206 2438 cpu_manager.go:221] "Starting CPU manager" policy="none" Feb 13 16:04:15.380226 kubelet[2438]: I0213 16:04:15.380219 2438 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Feb 13 16:04:15.380226 kubelet[2438]: I0213 16:04:15.380229 2438 state_mem.go:36] "Initialized new in-memory state store" Feb 13 16:04:15.381148 kubelet[2438]: I0213 16:04:15.381136 2438 policy_none.go:49] "None policy: Start" Feb 13 16:04:15.381148 kubelet[2438]: I0213 16:04:15.381148 2438 memory_manager.go:186] "Starting memorymanager" policy="None" Feb 13 16:04:15.381191 kubelet[2438]: I0213 16:04:15.381154 2438 state_mem.go:35] "Initializing new in-memory state store" Feb 13 16:04:15.384724 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Feb 13 16:04:15.396511 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Feb 13 16:04:15.398854 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Feb 13 16:04:15.409106 kubelet[2438]: I0213 16:04:15.409086 2438 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 16:04:15.409325 kubelet[2438]: I0213 16:04:15.409191 2438 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 16:04:15.409325 kubelet[2438]: I0213 16:04:15.409199 2438 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 16:04:15.409375 kubelet[2438]: I0213 16:04:15.409348 2438 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 16:04:15.410686 kubelet[2438]: E0213 16:04:15.410674 2438 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Feb 13 16:04:15.410723 kubelet[2438]: E0213 16:04:15.410695 2438 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Feb 13 16:04:15.476968 systemd[1]: Created slice kubepods-burstable-podc72911152bbceda2f57fd8d59261e015.slice - libcontainer container kubepods-burstable-podc72911152bbceda2f57fd8d59261e015.slice. 
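[Editorial note, not part of the log] After the QoS slices are created above (kubepods.slice, kubepods-burstable.slice, kubepods-besteffort.slice), systemd creates a per-pod slice whose name embeds the pod UID. The following sketch only reproduces the naming pattern visible in those entries; the dash-to-underscore escaping is an assumption, since the static-pod UIDs in this log are dash-free hashes.

    def pod_slice_name(qos_class: str, pod_uid: str) -> str:
        """Build a per-pod slice name in the pattern visible in the entries above.

        qos_class is "burstable", "besteffort", or "" for guaranteed pods; dashes
        in the UID are assumed to become underscores (a no-op here, since the
        static-pod UIDs in this log contain no dashes).
        """
        prefix = "kubepods" + (f"-{qos_class}" if qos_class else "")
        return f"{prefix}-pod{pod_uid.replace('-', '_')}.slice"

    # Reproduces kubepods-burstable-podc72911152bbceda2f57fd8d59261e015.slice from the log.
    print(pod_slice_name("burstable", "c72911152bbceda2f57fd8d59261e015"))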
Feb 13 16:04:15.489666 kubelet[2438]: E0213 16:04:15.489589 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:15.491948 systemd[1]: Created slice kubepods-burstable-pod46fdfbe66f40ad033f38495c1d8bd526.slice - libcontainer container kubepods-burstable-pod46fdfbe66f40ad033f38495c1d8bd526.slice. Feb 13 16:04:15.496467 kubelet[2438]: E0213 16:04:15.496448 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:15.499077 systemd[1]: Created slice kubepods-burstable-pod95ef9ac46cd4dbaadc63cb713310ae59.slice - libcontainer container kubepods-burstable-pod95ef9ac46cd4dbaadc63cb713310ae59.slice. Feb 13 16:04:15.500256 kubelet[2438]: E0213 16:04:15.500238 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:15.509950 kubelet[2438]: I0213 16:04:15.509938 2438 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:15.510187 kubelet[2438]: E0213 16:04:15.510170 2438 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Feb 13 16:04:15.557892 kubelet[2438]: E0213 16:04:15.557829 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="400ms" Feb 13 16:04:15.660676 kubelet[2438]: I0213 16:04:15.660434 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:15.660676 kubelet[2438]: I0213 16:04:15.660473 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:15.660676 kubelet[2438]: I0213 16:04:15.660490 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:15.660676 kubelet[2438]: I0213 16:04:15.660503 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:15.660676 kubelet[2438]: I0213 16:04:15.660516 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:15.660919 kubelet[2438]: I0213 16:04:15.660527 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:15.660919 kubelet[2438]: I0213 16:04:15.660539 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:15.660919 kubelet[2438]: I0213 16:04:15.660550 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/95ef9ac46cd4dbaadc63cb713310ae59-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"95ef9ac46cd4dbaadc63cb713310ae59\") " pod="kube-system/kube-scheduler-localhost" Feb 13 16:04:15.660919 kubelet[2438]: I0213 16:04:15.660577 2438 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:15.711491 kubelet[2438]: I0213 16:04:15.711257 2438 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:15.711491 kubelet[2438]: E0213 16:04:15.711446 2438 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Feb 13 16:04:15.791237 containerd[1557]: time="2025-02-13T16:04:15.791203743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c72911152bbceda2f57fd8d59261e015,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:15.797871 containerd[1557]: time="2025-02-13T16:04:15.797112564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:46fdfbe66f40ad033f38495c1d8bd526,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:15.801593 containerd[1557]: time="2025-02-13T16:04:15.801551259Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:95ef9ac46cd4dbaadc63cb713310ae59,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:15.958882 kubelet[2438]: E0213 16:04:15.958815 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="800ms" Feb 13 16:04:16.112630 kubelet[2438]: I0213 16:04:16.112608 2438 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:16.112901 kubelet[2438]: E0213 16:04:16.112844 2438 kubelet_node_status.go:108] "Unable to register node with API server" err="Post 
\"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Feb 13 16:04:16.287981 kubelet[2438]: W0213 16:04:16.287941 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:16.288058 kubelet[2438]: E0213 16:04:16.287988 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://139.178.70.109:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:16.293505 kubelet[2438]: W0213 16:04:16.293433 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:16.293505 kubelet[2438]: E0213 16:04:16.293454 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://139.178.70.109:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:16.304030 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1803200723.mount: Deactivated successfully. Feb 13 16:04:16.307104 containerd[1557]: time="2025-02-13T16:04:16.306168136Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:04:16.307104 containerd[1557]: time="2025-02-13T16:04:16.306697499Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:04:16.307104 containerd[1557]: time="2025-02-13T16:04:16.307083555Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 16:04:16.307423 containerd[1557]: time="2025-02-13T16:04:16.307409517Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:04:16.307606 containerd[1557]: time="2025-02-13T16:04:16.307587070Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Feb 13 16:04:16.308340 containerd[1557]: time="2025-02-13T16:04:16.308321738Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 16:04:16.308389 containerd[1557]: time="2025-02-13T16:04:16.308374945Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:04:16.310389 containerd[1557]: time="2025-02-13T16:04:16.310375179Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 16:04:16.310872 containerd[1557]: time="2025-02-13T16:04:16.310861107Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 514.992559ms" Feb 13 16:04:16.311670 containerd[1557]: time="2025-02-13T16:04:16.311653643Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 514.492958ms" Feb 13 16:04:16.313449 containerd[1557]: time="2025-02-13T16:04:16.313087436Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 511.479419ms" Feb 13 16:04:16.337723 kubelet[2438]: W0213 16:04:16.337503 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:16.337723 kubelet[2438]: E0213 16:04:16.337544 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://139.178.70.109:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:16.578253 containerd[1557]: time="2025-02-13T16:04:16.577478455Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:16.578253 containerd[1557]: time="2025-02-13T16:04:16.577512377Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:16.578253 containerd[1557]: time="2025-02-13T16:04:16.577519809Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.578253 containerd[1557]: time="2025-02-13T16:04:16.577565006Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.578958 containerd[1557]: time="2025-02-13T16:04:16.577460003Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:16.578958 containerd[1557]: time="2025-02-13T16:04:16.578552429Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:16.578958 containerd[1557]: time="2025-02-13T16:04:16.578564357Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.578958 containerd[1557]: time="2025-02-13T16:04:16.578675970Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.582900 containerd[1557]: time="2025-02-13T16:04:16.581813367Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:16.582900 containerd[1557]: time="2025-02-13T16:04:16.581904563Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:16.582900 containerd[1557]: time="2025-02-13T16:04:16.581955546Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.583336 containerd[1557]: time="2025-02-13T16:04:16.582016473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:16.607730 systemd[1]: Started cri-containerd-d0f8e4640bf429aba250a77b9b56b48c5a1c1011bd9b2d67b0c87ffd26cb60ff.scope - libcontainer container d0f8e4640bf429aba250a77b9b56b48c5a1c1011bd9b2d67b0c87ffd26cb60ff. Feb 13 16:04:16.611519 systemd[1]: Started cri-containerd-4d5951b4408a6643dc13273434cadfc3be861d8be989e1032f8510d0c023014b.scope - libcontainer container 4d5951b4408a6643dc13273434cadfc3be861d8be989e1032f8510d0c023014b. Feb 13 16:04:16.613028 systemd[1]: Started cri-containerd-7794008650b0d6a5ebeb6131698a58a51fddf49f536d0dbc4c0ce83102dcb481.scope - libcontainer container 7794008650b0d6a5ebeb6131698a58a51fddf49f536d0dbc4c0ce83102dcb481. Feb 13 16:04:16.644503 containerd[1557]: time="2025-02-13T16:04:16.644480466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:46fdfbe66f40ad033f38495c1d8bd526,Namespace:kube-system,Attempt:0,} returns sandbox id \"d0f8e4640bf429aba250a77b9b56b48c5a1c1011bd9b2d67b0c87ffd26cb60ff\"" Feb 13 16:04:16.649262 containerd[1557]: time="2025-02-13T16:04:16.649188844Z" level=info msg="CreateContainer within sandbox \"d0f8e4640bf429aba250a77b9b56b48c5a1c1011bd9b2d67b0c87ffd26cb60ff\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Feb 13 16:04:16.665768 containerd[1557]: time="2025-02-13T16:04:16.665212588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:c72911152bbceda2f57fd8d59261e015,Namespace:kube-system,Attempt:0,} returns sandbox id \"7794008650b0d6a5ebeb6131698a58a51fddf49f536d0dbc4c0ce83102dcb481\"" Feb 13 16:04:16.666939 containerd[1557]: time="2025-02-13T16:04:16.666925215Z" level=info msg="CreateContainer within sandbox \"d0f8e4640bf429aba250a77b9b56b48c5a1c1011bd9b2d67b0c87ffd26cb60ff\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e243ac66106d2cb198907b6a625cf9d14015af13d20ccd3e20da58419d7ad430\"" Feb 13 16:04:16.668349 containerd[1557]: time="2025-02-13T16:04:16.668069564Z" level=info msg="CreateContainer within sandbox \"7794008650b0d6a5ebeb6131698a58a51fddf49f536d0dbc4c0ce83102dcb481\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Feb 13 16:04:16.668394 containerd[1557]: time="2025-02-13T16:04:16.668361441Z" level=info msg="StartContainer for \"e243ac66106d2cb198907b6a625cf9d14015af13d20ccd3e20da58419d7ad430\"" Feb 13 16:04:16.673259 containerd[1557]: time="2025-02-13T16:04:16.673242890Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:95ef9ac46cd4dbaadc63cb713310ae59,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d5951b4408a6643dc13273434cadfc3be861d8be989e1032f8510d0c023014b\"" Feb 13 16:04:16.675475 containerd[1557]: time="2025-02-13T16:04:16.675454867Z" level=info msg="CreateContainer within sandbox \"4d5951b4408a6643dc13273434cadfc3be861d8be989e1032f8510d0c023014b\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Feb 13 16:04:16.683440 containerd[1557]: time="2025-02-13T16:04:16.683411336Z" level=info msg="CreateContainer within sandbox \"7794008650b0d6a5ebeb6131698a58a51fddf49f536d0dbc4c0ce83102dcb481\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"77ea3038a499c1cb0c6414f4a9c2ad7b64494b4148fe79b80973dea0145fe268\"" Feb 13 16:04:16.683823 containerd[1557]: time="2025-02-13T16:04:16.683800718Z" level=info msg="StartContainer for \"77ea3038a499c1cb0c6414f4a9c2ad7b64494b4148fe79b80973dea0145fe268\"" Feb 13 16:04:16.691585 containerd[1557]: time="2025-02-13T16:04:16.691565992Z" level=info msg="CreateContainer within sandbox \"4d5951b4408a6643dc13273434cadfc3be861d8be989e1032f8510d0c023014b\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"ab322c25f7ad64ac80cd92b4e8dcdb856a344217ee4a6cef27c46f7a8cb80798\"" Feb 13 16:04:16.692471 containerd[1557]: time="2025-02-13T16:04:16.692449169Z" level=info msg="StartContainer for \"ab322c25f7ad64ac80cd92b4e8dcdb856a344217ee4a6cef27c46f7a8cb80798\"" Feb 13 16:04:16.695080 systemd[1]: Started cri-containerd-e243ac66106d2cb198907b6a625cf9d14015af13d20ccd3e20da58419d7ad430.scope - libcontainer container e243ac66106d2cb198907b6a625cf9d14015af13d20ccd3e20da58419d7ad430. Feb 13 16:04:16.708772 systemd[1]: Started cri-containerd-77ea3038a499c1cb0c6414f4a9c2ad7b64494b4148fe79b80973dea0145fe268.scope - libcontainer container 77ea3038a499c1cb0c6414f4a9c2ad7b64494b4148fe79b80973dea0145fe268. Feb 13 16:04:16.717730 systemd[1]: Started cri-containerd-ab322c25f7ad64ac80cd92b4e8dcdb856a344217ee4a6cef27c46f7a8cb80798.scope - libcontainer container ab322c25f7ad64ac80cd92b4e8dcdb856a344217ee4a6cef27c46f7a8cb80798. 
Feb 13 16:04:16.734482 containerd[1557]: time="2025-02-13T16:04:16.734364995Z" level=info msg="StartContainer for \"e243ac66106d2cb198907b6a625cf9d14015af13d20ccd3e20da58419d7ad430\" returns successfully" Feb 13 16:04:16.735919 kubelet[2438]: W0213 16:04:16.735474 2438 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 139.178.70.109:6443: connect: connection refused Feb 13 16:04:16.735919 kubelet[2438]: E0213 16:04:16.735512 2438 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://139.178.70.109:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 139.178.70.109:6443: connect: connection refused" logger="UnhandledError" Feb 13 16:04:16.755542 containerd[1557]: time="2025-02-13T16:04:16.755522157Z" level=info msg="StartContainer for \"77ea3038a499c1cb0c6414f4a9c2ad7b64494b4148fe79b80973dea0145fe268\" returns successfully" Feb 13 16:04:16.759157 kubelet[2438]: E0213 16:04:16.759137 2438 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.109:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.109:6443: connect: connection refused" interval="1.6s" Feb 13 16:04:16.767126 containerd[1557]: time="2025-02-13T16:04:16.767037730Z" level=info msg="StartContainer for \"ab322c25f7ad64ac80cd92b4e8dcdb856a344217ee4a6cef27c46f7a8cb80798\" returns successfully" Feb 13 16:04:16.915005 kubelet[2438]: I0213 16:04:16.914951 2438 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:16.915312 kubelet[2438]: E0213 16:04:16.915293 2438 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://139.178.70.109:6443/api/v1/nodes\": dial tcp 139.178.70.109:6443: connect: connection refused" node="localhost" Feb 13 16:04:17.384719 kubelet[2438]: E0213 16:04:17.384595 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:17.385654 kubelet[2438]: E0213 16:04:17.385041 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:17.386905 kubelet[2438]: E0213 16:04:17.386893 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:18.338374 kubelet[2438]: I0213 16:04:18.338345 2438 apiserver.go:52] "Watching apiserver" Feb 13 16:04:18.359509 kubelet[2438]: I0213 16:04:18.359484 2438 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 16:04:18.361787 kubelet[2438]: E0213 16:04:18.361765 2438 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Feb 13 16:04:18.391118 kubelet[2438]: E0213 16:04:18.391100 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:18.391609 kubelet[2438]: E0213 16:04:18.391430 2438 kubelet.go:3196] "No need to create a mirror pod, since failed to get node 
info from the cluster" err="node \"localhost\" not found" node="localhost" Feb 13 16:04:18.438497 kubelet[2438]: E0213 16:04:18.438470 2438 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found Feb 13 16:04:18.517045 kubelet[2438]: I0213 16:04:18.516998 2438 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:18.523221 kubelet[2438]: I0213 16:04:18.523209 2438 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Feb 13 16:04:18.556469 kubelet[2438]: I0213 16:04:18.556447 2438 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:18.560929 kubelet[2438]: E0213 16:04:18.560897 2438 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:18.560929 kubelet[2438]: I0213 16:04:18.560914 2438 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 16:04:18.563648 kubelet[2438]: E0213 16:04:18.562467 2438 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost" Feb 13 16:04:18.563778 kubelet[2438]: I0213 16:04:18.562478 2438 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:18.565388 kubelet[2438]: E0213 16:04:18.565376 2438 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:19.950746 systemd[1]: Reload requested from client PID 2710 ('systemctl') (unit session-9.scope)... Feb 13 16:04:19.950762 systemd[1]: Reloading... Feb 13 16:04:20.003670 zram_generator::config[2754]: No configuration found. Feb 13 16:04:20.074533 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 16:04:20.092540 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 16:04:20.174982 systemd[1]: Reloading finished in 223 ms. Feb 13 16:04:20.191274 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:04:20.205345 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 16:04:20.205506 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 16:04:20.205542 systemd[1]: kubelet.service: Consumed 565ms CPU time, 121.3M memory peak. Feb 13 16:04:20.210894 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 16:04:20.565625 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 16:04:20.569537 (kubelet)[2822]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 16:04:20.627132 kubelet[2822]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 16:04:20.627132 kubelet[2822]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Feb 13 16:04:20.627132 kubelet[2822]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 16:04:20.627375 kubelet[2822]: I0213 16:04:20.627162 2822 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 16:04:20.631431 kubelet[2822]: I0213 16:04:20.631327 2822 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" Feb 13 16:04:20.631431 kubelet[2822]: I0213 16:04:20.631360 2822 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 16:04:20.631624 kubelet[2822]: I0213 16:04:20.631616 2822 server.go:954] "Client rotation is on, will bootstrap in background" Feb 13 16:04:20.632969 kubelet[2822]: I0213 16:04:20.632959 2822 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 16:04:20.636321 kubelet[2822]: I0213 16:04:20.636312 2822 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 16:04:20.645923 kubelet[2822]: E0213 16:04:20.645900 2822 log.go:32] "RuntimeConfig from runtime service failed" err="rpc error: code = Unimplemented desc = unknown method RuntimeConfig for service runtime.v1.RuntimeService" Feb 13 16:04:20.645923 kubelet[2822]: I0213 16:04:20.645921 2822 server.go:1421] "CRI implementation should be updated to support RuntimeConfig when KubeletCgroupDriverFromCRI feature gate has been enabled. Falling back to using cgroupDriver from kubelet config." Feb 13 16:04:20.647858 kubelet[2822]: I0213 16:04:20.647671 2822 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 16:04:20.648196 kubelet[2822]: I0213 16:04:20.648174 2822 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 16:04:20.648386 kubelet[2822]: I0213 16:04:20.648234 2822 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 13 16:04:20.648462 kubelet[2822]: I0213 16:04:20.648456 2822 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 16:04:20.648497 kubelet[2822]: I0213 16:04:20.648493 2822 container_manager_linux.go:304] "Creating device plugin manager" Feb 13 16:04:20.648679 kubelet[2822]: I0213 16:04:20.648547 2822 state_mem.go:36] "Initialized new in-memory state store" Feb 13 16:04:20.650977 kubelet[2822]: I0213 16:04:20.650967 2822 kubelet.go:446] "Attempting to sync node with API server" Feb 13 16:04:20.651042 kubelet[2822]: I0213 16:04:20.651035 2822 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 16:04:20.651084 kubelet[2822]: I0213 16:04:20.651080 2822 kubelet.go:352] "Adding apiserver pod source" Feb 13 16:04:20.651663 kubelet[2822]: I0213 16:04:20.651114 2822 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 16:04:20.655021 kubelet[2822]: I0213 16:04:20.654567 2822 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 16:04:20.655021 kubelet[2822]: I0213 16:04:20.654833 2822 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 16:04:20.657017 kubelet[2822]: I0213 16:04:20.656038 2822 watchdog_linux.go:99] "Systemd watchdog is not enabled" Feb 13 16:04:20.657017 kubelet[2822]: I0213 16:04:20.656059 2822 server.go:1287] "Started kubelet" Feb 13 16:04:20.657017 kubelet[2822]: I0213 16:04:20.656938 2822 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 16:04:20.659263 kubelet[2822]: I0213 16:04:20.658631 2822 server.go:169] "Starting to 
listen" address="0.0.0.0" port=10250 Feb 13 16:04:20.663665 kubelet[2822]: I0213 16:04:20.663426 2822 server.go:490] "Adding debug handlers to kubelet server" Feb 13 16:04:20.663943 kubelet[2822]: I0213 16:04:20.663856 2822 volume_manager.go:297] "Starting Kubelet Volume Manager" Feb 13 16:04:20.664125 kubelet[2822]: I0213 16:04:20.664034 2822 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Feb 13 16:04:20.667595 kubelet[2822]: I0213 16:04:20.667573 2822 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 16:04:20.667595 kubelet[2822]: I0213 16:04:20.663922 2822 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 16:04:20.667752 kubelet[2822]: I0213 16:04:20.667739 2822 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 16:04:20.667838 kubelet[2822]: I0213 16:04:20.667827 2822 reconciler.go:26] "Reconciler: start to sync state" Feb 13 16:04:20.670271 kubelet[2822]: I0213 16:04:20.669501 2822 factory.go:221] Registration of the systemd container factory successfully Feb 13 16:04:20.670271 kubelet[2822]: I0213 16:04:20.669551 2822 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 16:04:20.670271 kubelet[2822]: I0213 16:04:20.670040 2822 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 16:04:20.670756 kubelet[2822]: I0213 16:04:20.670710 2822 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Feb 13 16:04:20.671320 kubelet[2822]: I0213 16:04:20.671304 2822 status_manager.go:227] "Starting to sync pod status with apiserver" Feb 13 16:04:20.671350 kubelet[2822]: I0213 16:04:20.671324 2822 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Feb 13 16:04:20.671350 kubelet[2822]: I0213 16:04:20.671328 2822 kubelet.go:2388] "Starting kubelet main sync loop" Feb 13 16:04:20.672587 kubelet[2822]: E0213 16:04:20.672318 2822 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 16:04:20.673674 kubelet[2822]: I0213 16:04:20.673576 2822 factory.go:221] Registration of the containerd container factory successfully Feb 13 16:04:20.677214 kubelet[2822]: E0213 16:04:20.677199 2822 kubelet.go:1561] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 16:04:20.693979 kubelet[2822]: I0213 16:04:20.693961 2822 cpu_manager.go:221] "Starting CPU manager" policy="none" Feb 13 16:04:20.693979 kubelet[2822]: I0213 16:04:20.693974 2822 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Feb 13 16:04:20.694086 kubelet[2822]: I0213 16:04:20.694039 2822 state_mem.go:36] "Initialized new in-memory state store" Feb 13 16:04:20.694143 kubelet[2822]: I0213 16:04:20.694130 2822 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 16:04:20.694167 kubelet[2822]: I0213 16:04:20.694141 2822 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 13 16:04:20.694167 kubelet[2822]: I0213 16:04:20.694152 2822 policy_none.go:49] "None policy: Start" Feb 13 16:04:20.694167 kubelet[2822]: I0213 16:04:20.694158 2822 memory_manager.go:186] "Starting memorymanager" policy="None" Feb 13 16:04:20.694167 kubelet[2822]: I0213 16:04:20.694163 2822 state_mem.go:35] "Initializing new in-memory state store" Feb 13 16:04:20.694223 kubelet[2822]: I0213 16:04:20.694218 2822 state_mem.go:75] "Updated machine memory state" Feb 13 16:04:20.696273 kubelet[2822]: I0213 16:04:20.696258 2822 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 16:04:20.696409 kubelet[2822]: I0213 16:04:20.696384 2822 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 13 16:04:20.696440 kubelet[2822]: I0213 16:04:20.696409 2822 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 16:04:20.697453 kubelet[2822]: I0213 16:04:20.697441 2822 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 16:04:20.699300 kubelet[2822]: E0213 16:04:20.699290 2822 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Feb 13 16:04:20.773292 kubelet[2822]: I0213 16:04:20.773270 2822 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:20.774052 kubelet[2822]: I0213 16:04:20.773961 2822 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost" Feb 13 16:04:20.774126 kubelet[2822]: I0213 16:04:20.774104 2822 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:20.800268 kubelet[2822]: I0213 16:04:20.800256 2822 kubelet_node_status.go:76] "Attempting to register node" node="localhost" Feb 13 16:04:20.803855 kubelet[2822]: I0213 16:04:20.803840 2822 kubelet_node_status.go:125] "Node was previously registered" node="localhost" Feb 13 16:04:20.804048 kubelet[2822]: I0213 16:04:20.803946 2822 kubelet_node_status.go:79] "Successfully registered node" node="localhost" Feb 13 16:04:20.868513 kubelet[2822]: I0213 16:04:20.868416 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:20.868513 kubelet[2822]: I0213 16:04:20.868445 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:20.868513 kubelet[2822]: I0213 16:04:20.868459 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:20.868513 kubelet[2822]: I0213 16:04:20.868469 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:20.868513 kubelet[2822]: I0213 16:04:20.868477 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/95ef9ac46cd4dbaadc63cb713310ae59-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"95ef9ac46cd4dbaadc63cb713310ae59\") " pod="kube-system/kube-scheduler-localhost" Feb 13 16:04:20.868695 kubelet[2822]: I0213 16:04:20.868501 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:20.868695 kubelet[2822]: I0213 16:04:20.868508 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" 
(UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:20.868695 kubelet[2822]: I0213 16:04:20.868520 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/46fdfbe66f40ad033f38495c1d8bd526-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"46fdfbe66f40ad033f38495c1d8bd526\") " pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:20.868695 kubelet[2822]: I0213 16:04:20.868528 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c72911152bbceda2f57fd8d59261e015-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"c72911152bbceda2f57fd8d59261e015\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 16:04:21.654428 kubelet[2822]: I0213 16:04:21.654404 2822 apiserver.go:52] "Watching apiserver" Feb 13 16:04:21.668200 kubelet[2822]: I0213 16:04:21.668181 2822 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 16:04:21.689234 kubelet[2822]: I0213 16:04:21.688657 2822 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:21.726468 kubelet[2822]: E0213 16:04:21.726441 2822 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Feb 13 16:04:21.771241 kubelet[2822]: I0213 16:04:21.771206 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.771190424 podStartE2EDuration="1.771190424s" podCreationTimestamp="2025-02-13 16:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:21.756947399 +0000 UTC m=+1.176136079" watchObservedRunningTime="2025-02-13 16:04:21.771190424 +0000 UTC m=+1.190379099" Feb 13 16:04:21.792042 kubelet[2822]: I0213 16:04:21.792007 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.791996253 podStartE2EDuration="1.791996253s" podCreationTimestamp="2025-02-13 16:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:21.771736755 +0000 UTC m=+1.190925435" watchObservedRunningTime="2025-02-13 16:04:21.791996253 +0000 UTC m=+1.211184928" Feb 13 16:04:21.796550 kubelet[2822]: I0213 16:04:21.796526 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.796516038 podStartE2EDuration="1.796516038s" podCreationTimestamp="2025-02-13 16:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:21.792269909 +0000 UTC m=+1.211458588" watchObservedRunningTime="2025-02-13 16:04:21.796516038 +0000 UTC m=+1.215704709" Feb 13 16:04:24.591944 sudo[1874]: pam_unix(sudo:session): session closed for user root Feb 13 16:04:24.592676 sshd[1873]: Connection closed by 147.75.109.163 port 35430 Feb 13 16:04:24.593922 sshd-session[1870]: 
pam_unix(sshd:session): session closed for user core Feb 13 16:04:24.595797 systemd[1]: sshd@6-139.178.70.109:22-147.75.109.163:35430.service: Deactivated successfully. Feb 13 16:04:24.597076 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 16:04:24.597244 systemd[1]: session-9.scope: Consumed 3.019s CPU time, 143.1M memory peak. Feb 13 16:04:24.598147 systemd-logind[1538]: Session 9 logged out. Waiting for processes to exit. Feb 13 16:04:24.598759 systemd-logind[1538]: Removed session 9. Feb 13 16:04:24.776392 kubelet[2822]: I0213 16:04:24.776368 2822 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 16:04:24.776919 kubelet[2822]: I0213 16:04:24.776743 2822 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 16:04:24.776949 containerd[1557]: time="2025-02-13T16:04:24.776565405Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Feb 13 16:04:25.535050 systemd[1]: Created slice kubepods-besteffort-pod6169dbf9_4993_456b_9520_bb00d38d9efc.slice - libcontainer container kubepods-besteffort-pod6169dbf9_4993_456b_9520_bb00d38d9efc.slice. Feb 13 16:04:25.595703 kubelet[2822]: I0213 16:04:25.595682 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9nw\" (UniqueName: \"kubernetes.io/projected/6169dbf9-4993-456b-9520-bb00d38d9efc-kube-api-access-9b9nw\") pod \"kube-proxy-jvhg6\" (UID: \"6169dbf9-4993-456b-9520-bb00d38d9efc\") " pod="kube-system/kube-proxy-jvhg6" Feb 13 16:04:25.595703 kubelet[2822]: I0213 16:04:25.595705 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6169dbf9-4993-456b-9520-bb00d38d9efc-kube-proxy\") pod \"kube-proxy-jvhg6\" (UID: \"6169dbf9-4993-456b-9520-bb00d38d9efc\") " pod="kube-system/kube-proxy-jvhg6" Feb 13 16:04:25.595838 kubelet[2822]: I0213 16:04:25.595718 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6169dbf9-4993-456b-9520-bb00d38d9efc-xtables-lock\") pod \"kube-proxy-jvhg6\" (UID: \"6169dbf9-4993-456b-9520-bb00d38d9efc\") " pod="kube-system/kube-proxy-jvhg6" Feb 13 16:04:25.595838 kubelet[2822]: I0213 16:04:25.595728 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6169dbf9-4993-456b-9520-bb00d38d9efc-lib-modules\") pod \"kube-proxy-jvhg6\" (UID: \"6169dbf9-4993-456b-9520-bb00d38d9efc\") " pod="kube-system/kube-proxy-jvhg6" Feb 13 16:04:25.842441 containerd[1557]: time="2025-02-13T16:04:25.842281840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jvhg6,Uid:6169dbf9-4993-456b-9520-bb00d38d9efc,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:25.848145 kubelet[2822]: I0213 16:04:25.848113 2822 status_manager.go:890] "Failed to get status for pod" podUID="78f5040a-252f-48d7-90d8-79577e8b72ef" pod="tigera-operator/tigera-operator-7d68577dc5-8zmpr" err="pods \"tigera-operator-7d68577dc5-8zmpr\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" Feb 13 16:04:25.849274 kubelet[2822]: W0213 16:04:25.848159 2822 reflector.go:569] 
object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'localhost' and this object Feb 13 16:04:25.849274 kubelet[2822]: E0213 16:04:25.848176 2822 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'localhost' and this object" logger="UnhandledError" Feb 13 16:04:25.853900 systemd[1]: Created slice kubepods-besteffort-pod78f5040a_252f_48d7_90d8_79577e8b72ef.slice - libcontainer container kubepods-besteffort-pod78f5040a_252f_48d7_90d8_79577e8b72ef.slice. Feb 13 16:04:25.861879 containerd[1557]: time="2025-02-13T16:04:25.861753427Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:25.861879 containerd[1557]: time="2025-02-13T16:04:25.861788520Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:25.861879 containerd[1557]: time="2025-02-13T16:04:25.861798473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:25.861879 containerd[1557]: time="2025-02-13T16:04:25.861847371Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:25.881753 systemd[1]: Started cri-containerd-fbb1b6fa6e4fd22c4af737ba7c45acbb12715de5ea6140d09bd40b7027aa68fe.scope - libcontainer container fbb1b6fa6e4fd22c4af737ba7c45acbb12715de5ea6140d09bd40b7027aa68fe. 
Feb 13 16:04:25.896810 containerd[1557]: time="2025-02-13T16:04:25.896789304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jvhg6,Uid:6169dbf9-4993-456b-9520-bb00d38d9efc,Namespace:kube-system,Attempt:0,} returns sandbox id \"fbb1b6fa6e4fd22c4af737ba7c45acbb12715de5ea6140d09bd40b7027aa68fe\"" Feb 13 16:04:25.897673 kubelet[2822]: I0213 16:04:25.897660 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/78f5040a-252f-48d7-90d8-79577e8b72ef-var-lib-calico\") pod \"tigera-operator-7d68577dc5-8zmpr\" (UID: \"78f5040a-252f-48d7-90d8-79577e8b72ef\") " pod="tigera-operator/tigera-operator-7d68577dc5-8zmpr" Feb 13 16:04:25.897755 kubelet[2822]: I0213 16:04:25.897747 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km7p5\" (UniqueName: \"kubernetes.io/projected/78f5040a-252f-48d7-90d8-79577e8b72ef-kube-api-access-km7p5\") pod \"tigera-operator-7d68577dc5-8zmpr\" (UID: \"78f5040a-252f-48d7-90d8-79577e8b72ef\") " pod="tigera-operator/tigera-operator-7d68577dc5-8zmpr" Feb 13 16:04:25.902701 containerd[1557]: time="2025-02-13T16:04:25.902681068Z" level=info msg="CreateContainer within sandbox \"fbb1b6fa6e4fd22c4af737ba7c45acbb12715de5ea6140d09bd40b7027aa68fe\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 16:04:25.911850 containerd[1557]: time="2025-02-13T16:04:25.911828088Z" level=info msg="CreateContainer within sandbox \"fbb1b6fa6e4fd22c4af737ba7c45acbb12715de5ea6140d09bd40b7027aa68fe\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"fce82f7e9ca167cbe60ecef2c4ece179520f1097b29fe7acb82eb4ef758cea43\"" Feb 13 16:04:25.912578 containerd[1557]: time="2025-02-13T16:04:25.912565140Z" level=info msg="StartContainer for \"fce82f7e9ca167cbe60ecef2c4ece179520f1097b29fe7acb82eb4ef758cea43\"" Feb 13 16:04:25.934747 systemd[1]: Started cri-containerd-fce82f7e9ca167cbe60ecef2c4ece179520f1097b29fe7acb82eb4ef758cea43.scope - libcontainer container fce82f7e9ca167cbe60ecef2c4ece179520f1097b29fe7acb82eb4ef758cea43. Feb 13 16:04:25.954803 containerd[1557]: time="2025-02-13T16:04:25.954782375Z" level=info msg="StartContainer for \"fce82f7e9ca167cbe60ecef2c4ece179520f1097b29fe7acb82eb4ef758cea43\" returns successfully" Feb 13 16:04:26.157104 containerd[1557]: time="2025-02-13T16:04:26.157025268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-8zmpr,Uid:78f5040a-252f-48d7-90d8-79577e8b72ef,Namespace:tigera-operator,Attempt:0,}" Feb 13 16:04:26.172351 containerd[1557]: time="2025-02-13T16:04:26.172228040Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:26.172598 containerd[1557]: time="2025-02-13T16:04:26.172451681Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:26.172598 containerd[1557]: time="2025-02-13T16:04:26.172491649Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:26.172728 containerd[1557]: time="2025-02-13T16:04:26.172702214Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:26.189754 systemd[1]: Started cri-containerd-fab1c4a620411ffa9ca8bcb2ca87f984fab6b83b9840e1a7c00dc2432001f751.scope - libcontainer container fab1c4a620411ffa9ca8bcb2ca87f984fab6b83b9840e1a7c00dc2432001f751. Feb 13 16:04:26.221040 containerd[1557]: time="2025-02-13T16:04:26.220994932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7d68577dc5-8zmpr,Uid:78f5040a-252f-48d7-90d8-79577e8b72ef,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"fab1c4a620411ffa9ca8bcb2ca87f984fab6b83b9840e1a7c00dc2432001f751\"" Feb 13 16:04:26.235400 containerd[1557]: time="2025-02-13T16:04:26.235362412Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Feb 13 16:04:26.711293 kubelet[2822]: I0213 16:04:26.710735 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jvhg6" podStartSLOduration=1.7107214229999999 podStartE2EDuration="1.710721423s" podCreationTimestamp="2025-02-13 16:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:26.71066032 +0000 UTC m=+6.129848991" watchObservedRunningTime="2025-02-13 16:04:26.710721423 +0000 UTC m=+6.129910104" Feb 13 16:04:27.758313 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1202982767.mount: Deactivated successfully. Feb 13 16:04:28.140690 containerd[1557]: time="2025-02-13T16:04:28.140444809Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Feb 13 16:04:28.142795 containerd[1557]: time="2025-02-13T16:04:28.142772614Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.907379674s" Feb 13 16:04:28.142910 containerd[1557]: time="2025-02-13T16:04:28.142855439Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Feb 13 16:04:28.148239 containerd[1557]: time="2025-02-13T16:04:28.148165335Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:28.148705 containerd[1557]: time="2025-02-13T16:04:28.148670810Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:28.149138 containerd[1557]: time="2025-02-13T16:04:28.149124951Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:28.157666 containerd[1557]: time="2025-02-13T16:04:28.157620595Z" level=info msg="CreateContainer within sandbox \"fab1c4a620411ffa9ca8bcb2ca87f984fab6b83b9840e1a7c00dc2432001f751\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 16:04:28.176499 containerd[1557]: time="2025-02-13T16:04:28.176380421Z" level=info msg="CreateContainer within sandbox \"fab1c4a620411ffa9ca8bcb2ca87f984fab6b83b9840e1a7c00dc2432001f751\" for 
&ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"650f93ed12d077dd1bd0704d5f141cadd60f7186489451feb7d8874193956526\"" Feb 13 16:04:28.178539 containerd[1557]: time="2025-02-13T16:04:28.178526025Z" level=info msg="StartContainer for \"650f93ed12d077dd1bd0704d5f141cadd60f7186489451feb7d8874193956526\"" Feb 13 16:04:28.201744 systemd[1]: Started cri-containerd-650f93ed12d077dd1bd0704d5f141cadd60f7186489451feb7d8874193956526.scope - libcontainer container 650f93ed12d077dd1bd0704d5f141cadd60f7186489451feb7d8874193956526. Feb 13 16:04:28.218061 containerd[1557]: time="2025-02-13T16:04:28.218037174Z" level=info msg="StartContainer for \"650f93ed12d077dd1bd0704d5f141cadd60f7186489451feb7d8874193956526\" returns successfully" Feb 13 16:04:28.788665 kubelet[2822]: I0213 16:04:28.788596 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7d68577dc5-8zmpr" podStartSLOduration=1.8607412380000001 podStartE2EDuration="3.786061775s" podCreationTimestamp="2025-02-13 16:04:25 +0000 UTC" firstStartedPulling="2025-02-13 16:04:26.221794747 +0000 UTC m=+5.640983419" lastFinishedPulling="2025-02-13 16:04:28.147115284 +0000 UTC m=+7.566303956" observedRunningTime="2025-02-13 16:04:28.783342966 +0000 UTC m=+8.202531658" watchObservedRunningTime="2025-02-13 16:04:28.786061775 +0000 UTC m=+8.205250449" Feb 13 16:04:30.999780 systemd[1]: Created slice kubepods-besteffort-pod8c73f8dc_8791_4227_a813_78ae1fa7181e.slice - libcontainer container kubepods-besteffort-pod8c73f8dc_8791_4227_a813_78ae1fa7181e.slice. Feb 13 16:04:31.127773 systemd[1]: Created slice kubepods-besteffort-podd34f9a57_2c9d_48c8_817f_e4534510ad9f.slice - libcontainer container kubepods-besteffort-podd34f9a57_2c9d_48c8_817f_e4534510ad9f.slice. 
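
The containerd messages above report both the byte count ("bytes read=21762497") and the wall-clock duration ("in 1.907379674s") for the quay.io/tigera/operator:v1.36.2 pull, so the effective transfer rate can be sanity-checked directly from the log. A small back-of-the-envelope sketch using only the figures quoted above:

    # Figures copied from the containerd pull messages above; illustration only.
    bytes_read = 21_762_497        # "stop pulling image ...: bytes read=21762497"
    pull_seconds = 1.907379674     # "Pulled image ... in 1.907379674s"

    mib = bytes_read / (1024 * 1024)
    print(f"pulled ~{mib:.1f} MiB in {pull_seconds:.2f} s "
          f"(~{mib / pull_seconds:.1f} MiB/s)")

That works out to roughly 21 MiB in just under two seconds, about 11 MiB/s, which lines up with the tigera-operator pod's podStartE2EDuration of a few seconds reported above.
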
Feb 13 16:04:31.154195 kubelet[2822]: I0213 16:04:31.154109 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8c73f8dc-8791-4227-a813-78ae1fa7181e-tigera-ca-bundle\") pod \"calico-typha-7874d98995-wkrtc\" (UID: \"8c73f8dc-8791-4227-a813-78ae1fa7181e\") " pod="calico-system/calico-typha-7874d98995-wkrtc" Feb 13 16:04:31.154195 kubelet[2822]: I0213 16:04:31.154139 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8c73f8dc-8791-4227-a813-78ae1fa7181e-typha-certs\") pod \"calico-typha-7874d98995-wkrtc\" (UID: \"8c73f8dc-8791-4227-a813-78ae1fa7181e\") " pod="calico-system/calico-typha-7874d98995-wkrtc" Feb 13 16:04:31.154195 kubelet[2822]: I0213 16:04:31.154151 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv2pr\" (UniqueName: \"kubernetes.io/projected/8c73f8dc-8791-4227-a813-78ae1fa7181e-kube-api-access-jv2pr\") pod \"calico-typha-7874d98995-wkrtc\" (UID: \"8c73f8dc-8791-4227-a813-78ae1fa7181e\") " pod="calico-system/calico-typha-7874d98995-wkrtc" Feb 13 16:04:31.224113 kubelet[2822]: E0213 16:04:31.223672 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:31.255260 kubelet[2822]: I0213 16:04:31.255166 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-lib-modules\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.255260 kubelet[2822]: I0213 16:04:31.255190 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-cni-net-dir\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.255260 kubelet[2822]: I0213 16:04:31.255201 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4e58e6b2-6097-4f35-ba27-10ac9fc1ce49-socket-dir\") pod \"csi-node-driver-tndq9\" (UID: \"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49\") " pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:31.255260 kubelet[2822]: I0213 16:04:31.255212 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxp65\" (UniqueName: \"kubernetes.io/projected/4e58e6b2-6097-4f35-ba27-10ac9fc1ce49-kube-api-access-kxp65\") pod \"csi-node-driver-tndq9\" (UID: \"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49\") " pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:31.255260 kubelet[2822]: I0213 16:04:31.255234 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-xtables-lock\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " 
pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.255411 kubelet[2822]: I0213 16:04:31.255245 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-cni-log-dir\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.255411 kubelet[2822]: I0213 16:04:31.255253 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4e58e6b2-6097-4f35-ba27-10ac9fc1ce49-kubelet-dir\") pod \"csi-node-driver-tndq9\" (UID: \"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49\") " pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:31.255411 kubelet[2822]: I0213 16:04:31.255263 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-cni-bin-dir\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.255411 kubelet[2822]: I0213 16:04:31.255271 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4e58e6b2-6097-4f35-ba27-10ac9fc1ce49-varrun\") pod \"csi-node-driver-tndq9\" (UID: \"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49\") " pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:31.255411 kubelet[2822]: I0213 16:04:31.255286 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkkz\" (UniqueName: \"kubernetes.io/projected/d34f9a57-2c9d-48c8-817f-e4534510ad9f-kube-api-access-zxkkz\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.261531 kubelet[2822]: I0213 16:04:31.255294 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4e58e6b2-6097-4f35-ba27-10ac9fc1ce49-registration-dir\") pod \"csi-node-driver-tndq9\" (UID: \"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49\") " pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:31.261531 kubelet[2822]: I0213 16:04:31.255304 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d34f9a57-2c9d-48c8-817f-e4534510ad9f-tigera-ca-bundle\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.261531 kubelet[2822]: I0213 16:04:31.255312 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-var-lib-calico\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.261531 kubelet[2822]: I0213 16:04:31.255320 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-flexvol-driver-host\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 
16:04:31.261531 kubelet[2822]: I0213 16:04:31.255329 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-policysync\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.261624 kubelet[2822]: I0213 16:04:31.255339 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d34f9a57-2c9d-48c8-817f-e4534510ad9f-node-certs\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.261624 kubelet[2822]: I0213 16:04:31.255349 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d34f9a57-2c9d-48c8-817f-e4534510ad9f-var-run-calico\") pod \"calico-node-xhs2q\" (UID: \"d34f9a57-2c9d-48c8-817f-e4534510ad9f\") " pod="calico-system/calico-node-xhs2q" Feb 13 16:04:31.304676 containerd[1557]: time="2025-02-13T16:04:31.304561340Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7874d98995-wkrtc,Uid:8c73f8dc-8791-4227-a813-78ae1fa7181e,Namespace:calico-system,Attempt:0,}" Feb 13 16:04:31.326462 containerd[1557]: time="2025-02-13T16:04:31.326392376Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:31.326462 containerd[1557]: time="2025-02-13T16:04:31.326440300Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:31.326462 containerd[1557]: time="2025-02-13T16:04:31.326449120Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:31.327095 containerd[1557]: time="2025-02-13T16:04:31.326981670Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:31.344738 systemd[1]: Started cri-containerd-7f88a5d5d3f924b79aa7a3de4c5068d9a2e86c808a7638ed46398b2f6b427ba9.scope - libcontainer container 7f88a5d5d3f924b79aa7a3de4c5068d9a2e86c808a7638ed46398b2f6b427ba9. Feb 13 16:04:31.359442 kubelet[2822]: E0213 16:04:31.359314 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.359442 kubelet[2822]: W0213 16:04:31.359328 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.359442 kubelet[2822]: E0213 16:04:31.359350 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:31.359776 kubelet[2822]: E0213 16:04:31.359570 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.359776 kubelet[2822]: W0213 16:04:31.359576 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.359776 kubelet[2822]: E0213 16:04:31.359589 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.359874 kubelet[2822]: E0213 16:04:31.359869 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.359908 kubelet[2822]: W0213 16:04:31.359903 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.359974 kubelet[2822]: E0213 16:04:31.359937 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.360253 kubelet[2822]: E0213 16:04:31.360212 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.360253 kubelet[2822]: W0213 16:04:31.360218 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.360412 kubelet[2822]: E0213 16:04:31.360313 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:31.360504 kubelet[2822]: E0213 16:04:31.360499 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.360604 kubelet[2822]: W0213 16:04:31.360597 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.361777 kubelet[2822]: E0213 16:04:31.361708 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.361777 kubelet[2822]: W0213 16:04:31.361718 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.362141 kubelet[2822]: E0213 16:04:31.361929 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.362141 kubelet[2822]: W0213 16:04:31.361934 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.362141 kubelet[2822]: E0213 16:04:31.361941 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.362141 kubelet[2822]: E0213 16:04:31.361954 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.362141 kubelet[2822]: E0213 16:04:31.362049 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.362141 kubelet[2822]: W0213 16:04:31.362054 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.362141 kubelet[2822]: E0213 16:04:31.362059 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.362649 kubelet[2822]: E0213 16:04:31.362267 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.362649 kubelet[2822]: E0213 16:04:31.362330 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.362649 kubelet[2822]: W0213 16:04:31.362335 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.362649 kubelet[2822]: E0213 16:04:31.362343 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:31.362878 kubelet[2822]: E0213 16:04:31.362867 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.362878 kubelet[2822]: W0213 16:04:31.362875 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.363292 kubelet[2822]: E0213 16:04:31.362883 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.363292 kubelet[2822]: E0213 16:04:31.363169 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.363292 kubelet[2822]: W0213 16:04:31.363175 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.363292 kubelet[2822]: E0213 16:04:31.363180 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.363546 kubelet[2822]: E0213 16:04:31.363452 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.363546 kubelet[2822]: W0213 16:04:31.363460 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.363546 kubelet[2822]: E0213 16:04:31.363467 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.363743 kubelet[2822]: E0213 16:04:31.363629 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.363743 kubelet[2822]: W0213 16:04:31.363636 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.363743 kubelet[2822]: E0213 16:04:31.363684 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.364165 kubelet[2822]: E0213 16:04:31.364130 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.364165 kubelet[2822]: W0213 16:04:31.364138 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.364165 kubelet[2822]: E0213 16:04:31.364144 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:31.366157 kubelet[2822]: E0213 16:04:31.366147 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:31.366231 kubelet[2822]: W0213 16:04:31.366208 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:31.366231 kubelet[2822]: E0213 16:04:31.366219 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:31.383833 containerd[1557]: time="2025-02-13T16:04:31.383742949Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7874d98995-wkrtc,Uid:8c73f8dc-8791-4227-a813-78ae1fa7181e,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f88a5d5d3f924b79aa7a3de4c5068d9a2e86c808a7638ed46398b2f6b427ba9\"" Feb 13 16:04:31.384998 containerd[1557]: time="2025-02-13T16:04:31.384882343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 16:04:31.430971 containerd[1557]: time="2025-02-13T16:04:31.430944993Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xhs2q,Uid:d34f9a57-2c9d-48c8-817f-e4534510ad9f,Namespace:calico-system,Attempt:0,}" Feb 13 16:04:31.442115 containerd[1557]: time="2025-02-13T16:04:31.442042987Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:31.442115 containerd[1557]: time="2025-02-13T16:04:31.442083834Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:31.442115 containerd[1557]: time="2025-02-13T16:04:31.442101891Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:31.442360 containerd[1557]: time="2025-02-13T16:04:31.442156425Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:31.457783 systemd[1]: Started cri-containerd-373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac.scope - libcontainer container 373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac. Feb 13 16:04:31.472894 containerd[1557]: time="2025-02-13T16:04:31.472795052Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xhs2q,Uid:d34f9a57-2c9d-48c8-817f-e4534510ad9f,Namespace:calico-system,Attempt:0,} returns sandbox id \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\"" Feb 13 16:04:32.765041 kubelet[2822]: E0213 16:04:32.765028 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:32.765364 kubelet[2822]: W0213 16:04:32.765287 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:32.765364 kubelet[2822]: E0213 16:04:32.765301 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:32.765544 kubelet[2822]: E0213 16:04:32.765411 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:32.765544 kubelet[2822]: W0213 16:04:32.765416 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:32.765544 kubelet[2822]: E0213 16:04:32.765422 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:32.765741 kubelet[2822]: E0213 16:04:32.765623 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:32.765741 kubelet[2822]: W0213 16:04:32.765629 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:32.765741 kubelet[2822]: E0213 16:04:32.765634 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:32.765879 kubelet[2822]: E0213 16:04:32.765822 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:32.765879 kubelet[2822]: W0213 16:04:32.765828 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:32.765879 kubelet[2822]: E0213 16:04:32.765833 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:32.765968 kubelet[2822]: E0213 16:04:32.765962 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:32.766035 kubelet[2822]: W0213 16:04:32.765995 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:32.766035 kubelet[2822]: E0213 16:04:32.766004 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.013268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2244532032.mount: Deactivated successfully. 
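
The repeated driver-call.go and plugins.go errors above all share the root cause spelled out in the accompanying warnings: the FlexVolume probe invokes /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init command and expects a JSON status document (for example {"status": "Success"}) on stdout, but the executable is not on the host yet, so the kubelet reads an empty string and the unmarshal fails with "unexpected end of JSON input". A minimal sketch of that failure mode, assuming an empty driver response:

    # Minimal illustration of the repeated FlexVolume errors above: decoding an
    # empty driver response fails before any status field can be read.
    import json

    driver_stdout = ""  # stand-in for the output of the missing nodeagent~uds/uds binary
    try:
        status = json.loads(driver_stdout)
    except json.JSONDecodeError as err:
        print(f"failed to unmarshal driver output: {err}")

Presumably the probe keeps retrying and the noise stops once the Calico flexvol driver is installed into that host directory (note the flexvol-driver-host host-path volume attached to calico-node-xhs2q above).
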
Feb 13 16:04:33.489537 containerd[1557]: time="2025-02-13T16:04:33.489513304Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:33.490094 containerd[1557]: time="2025-02-13T16:04:33.490060848Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Feb 13 16:04:33.490667 containerd[1557]: time="2025-02-13T16:04:33.490353621Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:33.492652 containerd[1557]: time="2025-02-13T16:04:33.491581019Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:33.492958 containerd[1557]: time="2025-02-13T16:04:33.492946400Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.108048289s" Feb 13 16:04:33.493011 containerd[1557]: time="2025-02-13T16:04:33.493002296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Feb 13 16:04:33.495267 containerd[1557]: time="2025-02-13T16:04:33.495255871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 16:04:33.503471 containerd[1557]: time="2025-02-13T16:04:33.503446254Z" level=info msg="CreateContainer within sandbox \"7f88a5d5d3f924b79aa7a3de4c5068d9a2e86c808a7638ed46398b2f6b427ba9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 16:04:33.509646 containerd[1557]: time="2025-02-13T16:04:33.509618608Z" level=info msg="CreateContainer within sandbox \"7f88a5d5d3f924b79aa7a3de4c5068d9a2e86c808a7638ed46398b2f6b427ba9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"ec15cfc7139476fdc7f7df206c597977bb4d7dd750dca2e782f1db9ccef1bc78\"" Feb 13 16:04:33.510018 containerd[1557]: time="2025-02-13T16:04:33.509993671Z" level=info msg="StartContainer for \"ec15cfc7139476fdc7f7df206c597977bb4d7dd750dca2e782f1db9ccef1bc78\"" Feb 13 16:04:33.549742 systemd[1]: Started cri-containerd-ec15cfc7139476fdc7f7df206c597977bb4d7dd750dca2e782f1db9ccef1bc78.scope - libcontainer container ec15cfc7139476fdc7f7df206c597977bb4d7dd750dca2e782f1db9ccef1bc78. 
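
Most of the containerd entries in this capture share the same time=... level=... msg="..." shape, so a few lines of scripting are enough to pull individual fields out when scanning long runs like the typha image-pull sequence above. A rough sketch; the sample line is copied from the pull messages above, and shlex stands in for a proper logfmt parser:

    # Rough convenience parser for containerd's key="value" style entries above.
    # shlex handles the quoting; a real logfmt parser would be more robust.
    import shlex

    line = ('time="2025-02-13T16:04:33.490060848Z" level=info '
            'msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: '
            'active requests=0, bytes read=31343363"')

    fields = dict(tok.split("=", 1) for tok in shlex.split(line) if "=" in tok)
    print(fields["time"], fields["level"])
    print(fields["msg"])
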
Feb 13 16:04:33.575688 containerd[1557]: time="2025-02-13T16:04:33.575554628Z" level=info msg="StartContainer for \"ec15cfc7139476fdc7f7df206c597977bb4d7dd750dca2e782f1db9ccef1bc78\" returns successfully" Feb 13 16:04:33.672101 kubelet[2822]: E0213 16:04:33.672049 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:33.769083 kubelet[2822]: I0213 16:04:33.769007 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7874d98995-wkrtc" podStartSLOduration=1.659790986 podStartE2EDuration="3.768998258s" podCreationTimestamp="2025-02-13 16:04:30 +0000 UTC" firstStartedPulling="2025-02-13 16:04:31.384621052 +0000 UTC m=+10.803809724" lastFinishedPulling="2025-02-13 16:04:33.493828324 +0000 UTC m=+12.913016996" observedRunningTime="2025-02-13 16:04:33.768562714 +0000 UTC m=+13.187751395" watchObservedRunningTime="2025-02-13 16:04:33.768998258 +0000 UTC m=+13.188186933" Feb 13 16:04:33.772334 kubelet[2822]: E0213 16:04:33.772319 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772334 kubelet[2822]: W0213 16:04:33.772331 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772400 kubelet[2822]: E0213 16:04:33.772343 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.772436 kubelet[2822]: E0213 16:04:33.772432 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772457 kubelet[2822]: W0213 16:04:33.772437 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772457 kubelet[2822]: E0213 16:04:33.772442 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.772534 kubelet[2822]: E0213 16:04:33.772525 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772534 kubelet[2822]: W0213 16:04:33.772532 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772580 kubelet[2822]: E0213 16:04:33.772537 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.772661 kubelet[2822]: E0213 16:04:33.772650 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772661 kubelet[2822]: W0213 16:04:33.772657 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772727 kubelet[2822]: E0213 16:04:33.772664 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.772759 kubelet[2822]: E0213 16:04:33.772750 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772759 kubelet[2822]: W0213 16:04:33.772754 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772803 kubelet[2822]: E0213 16:04:33.772759 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.772846 kubelet[2822]: E0213 16:04:33.772834 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772846 kubelet[2822]: W0213 16:04:33.772842 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772890 kubelet[2822]: E0213 16:04:33.772847 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.772925 kubelet[2822]: E0213 16:04:33.772922 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.772947 kubelet[2822]: W0213 16:04:33.772926 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.772947 kubelet[2822]: E0213 16:04:33.772930 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.773017 kubelet[2822]: E0213 16:04:33.773008 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773046 kubelet[2822]: W0213 16:04:33.773018 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773046 kubelet[2822]: E0213 16:04:33.773024 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.773119 kubelet[2822]: E0213 16:04:33.773110 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773119 kubelet[2822]: W0213 16:04:33.773117 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773242 kubelet[2822]: E0213 16:04:33.773124 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.773242 kubelet[2822]: E0213 16:04:33.773208 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773242 kubelet[2822]: W0213 16:04:33.773212 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773242 kubelet[2822]: E0213 16:04:33.773217 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.773325 kubelet[2822]: E0213 16:04:33.773291 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773325 kubelet[2822]: W0213 16:04:33.773296 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773325 kubelet[2822]: E0213 16:04:33.773300 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.773387 kubelet[2822]: E0213 16:04:33.773376 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773387 kubelet[2822]: W0213 16:04:33.773380 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773387 kubelet[2822]: E0213 16:04:33.773384 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.773473 kubelet[2822]: E0213 16:04:33.773462 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773473 kubelet[2822]: W0213 16:04:33.773467 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773473 kubelet[2822]: E0213 16:04:33.773471 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.773555 kubelet[2822]: E0213 16:04:33.773542 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.773555 kubelet[2822]: W0213 16:04:33.773548 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.773555 kubelet[2822]: E0213 16:04:33.773552 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773654 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774158 kubelet[2822]: W0213 16:04:33.773659 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773664 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773777 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774158 kubelet[2822]: W0213 16:04:33.773782 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773787 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773906 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774158 kubelet[2822]: W0213 16:04:33.773910 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.773917 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774158 kubelet[2822]: E0213 16:04:33.774023 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774442 kubelet[2822]: W0213 16:04:33.774028 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774442 kubelet[2822]: E0213 16:04:33.774037 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.774442 kubelet[2822]: E0213 16:04:33.774138 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774442 kubelet[2822]: W0213 16:04:33.774143 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774442 kubelet[2822]: E0213 16:04:33.774149 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774442 kubelet[2822]: E0213 16:04:33.774230 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774442 kubelet[2822]: W0213 16:04:33.774234 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774442 kubelet[2822]: E0213 16:04:33.774241 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774691 kubelet[2822]: E0213 16:04:33.774604 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774691 kubelet[2822]: W0213 16:04:33.774612 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774691 kubelet[2822]: E0213 16:04:33.774625 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774888 kubelet[2822]: E0213 16:04:33.774749 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774888 kubelet[2822]: W0213 16:04:33.774754 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.774888 kubelet[2822]: E0213 16:04:33.774764 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.774888 kubelet[2822]: E0213 16:04:33.774866 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.774888 kubelet[2822]: W0213 16:04:33.774871 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775050 kubelet[2822]: E0213 16:04:33.774980 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.775149 kubelet[2822]: E0213 16:04:33.775097 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775149 kubelet[2822]: W0213 16:04:33.775102 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775149 kubelet[2822]: E0213 16:04:33.775117 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.775295 kubelet[2822]: E0213 16:04:33.775266 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775295 kubelet[2822]: W0213 16:04:33.775271 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775295 kubelet[2822]: E0213 16:04:33.775287 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.775492 kubelet[2822]: E0213 16:04:33.775442 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775492 kubelet[2822]: W0213 16:04:33.775447 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775492 kubelet[2822]: E0213 16:04:33.775465 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.775656 kubelet[2822]: E0213 16:04:33.775585 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775656 kubelet[2822]: W0213 16:04:33.775590 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775656 kubelet[2822]: E0213 16:04:33.775597 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.775786 kubelet[2822]: E0213 16:04:33.775752 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775786 kubelet[2822]: W0213 16:04:33.775758 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775786 kubelet[2822]: E0213 16:04:33.775767 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:33.775976 kubelet[2822]: E0213 16:04:33.775928 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.775976 kubelet[2822]: W0213 16:04:33.775934 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.775976 kubelet[2822]: E0213 16:04:33.775944 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.776170 kubelet[2822]: E0213 16:04:33.776122 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.776170 kubelet[2822]: W0213 16:04:33.776127 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.776170 kubelet[2822]: E0213 16:04:33.776137 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.776331 kubelet[2822]: E0213 16:04:33.776226 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.776331 kubelet[2822]: W0213 16:04:33.776231 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.776331 kubelet[2822]: E0213 16:04:33.776240 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.776454 kubelet[2822]: E0213 16:04:33.776448 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.776490 kubelet[2822]: W0213 16:04:33.776484 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.776572 kubelet[2822]: E0213 16:04:33.776527 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:33.776637 kubelet[2822]: E0213 16:04:33.776632 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:33.776692 kubelet[2822]: W0213 16:04:33.776675 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:33.776692 kubelet[2822]: E0213 16:04:33.776683 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.765354 kubelet[2822]: I0213 16:04:34.765336 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:34.778634 kubelet[2822]: E0213 16:04:34.778574 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.778634 kubelet[2822]: W0213 16:04:34.778583 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.778634 kubelet[2822]: E0213 16:04:34.778591 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778700 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779063 kubelet[2822]: W0213 16:04:34.778705 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778725 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778821 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779063 kubelet[2822]: W0213 16:04:34.778825 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778831 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778917 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779063 kubelet[2822]: W0213 16:04:34.778922 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.778927 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779063 kubelet[2822]: E0213 16:04:34.779021 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779475 kubelet[2822]: W0213 16:04:34.779025 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779031 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779117 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779475 kubelet[2822]: W0213 16:04:34.779121 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779152 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779252 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779475 kubelet[2822]: W0213 16:04:34.779257 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779261 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779475 kubelet[2822]: E0213 16:04:34.779335 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779475 kubelet[2822]: W0213 16:04:34.779339 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779343 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779433 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779782 kubelet[2822]: W0213 16:04:34.779439 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779448 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779533 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779782 kubelet[2822]: W0213 16:04:34.779537 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779541 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779616 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.779782 kubelet[2822]: W0213 16:04:34.779620 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.779782 kubelet[2822]: E0213 16:04:34.779624 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779708 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780163 kubelet[2822]: W0213 16:04:34.779712 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779717 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779794 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780163 kubelet[2822]: W0213 16:04:34.779798 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779802 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779873 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780163 kubelet[2822]: W0213 16:04:34.779877 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779881 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780163 kubelet[2822]: E0213 16:04:34.779950 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780506 kubelet[2822]: W0213 16:04:34.779954 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.779959 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.780063 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780506 kubelet[2822]: W0213 16:04:34.780067 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.780075 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.780151 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780506 kubelet[2822]: W0213 16:04:34.780155 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.780165 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780506 kubelet[2822]: E0213 16:04:34.780248 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780506 kubelet[2822]: W0213 16:04:34.780252 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780258 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780332 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780840 kubelet[2822]: W0213 16:04:34.780336 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780340 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780428 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780840 kubelet[2822]: W0213 16:04:34.780433 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780437 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780507 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.780840 kubelet[2822]: W0213 16:04:34.780511 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.780840 kubelet[2822]: E0213 16:04:34.780516 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780583 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.781537 kubelet[2822]: W0213 16:04:34.780587 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780592 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780674 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.781537 kubelet[2822]: W0213 16:04:34.780678 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780688 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780776 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.781537 kubelet[2822]: W0213 16:04:34.780780 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780786 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.781537 kubelet[2822]: E0213 16:04:34.780862 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.782616 kubelet[2822]: W0213 16:04:34.780866 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.780873 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.780951 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.782616 kubelet[2822]: W0213 16:04:34.780955 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.780962 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.781040 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.782616 kubelet[2822]: W0213 16:04:34.781044 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.781051 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.782616 kubelet[2822]: E0213 16:04:34.781125 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.782616 kubelet[2822]: W0213 16:04:34.781129 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781134 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781202 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783057 kubelet[2822]: W0213 16:04:34.781206 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781251 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781343 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783057 kubelet[2822]: W0213 16:04:34.781348 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781354 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781677 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783057 kubelet[2822]: W0213 16:04:34.781685 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783057 kubelet[2822]: E0213 16:04:34.781698 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.781860 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783300 kubelet[2822]: W0213 16:04:34.781866 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.781878 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.782002 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783300 kubelet[2822]: W0213 16:04:34.782008 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.782017 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.782334 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783300 kubelet[2822]: W0213 16:04:34.782342 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.782354 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783300 kubelet[2822]: E0213 16:04:34.782681 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783538 kubelet[2822]: W0213 16:04:34.782687 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783538 kubelet[2822]: E0213 16:04:34.782703 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.783538 kubelet[2822]: E0213 16:04:34.782802 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783839 kubelet[2822]: W0213 16:04:34.783619 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783839 kubelet[2822]: E0213 16:04:34.783656 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.783839 kubelet[2822]: E0213 16:04:34.783774 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.783839 kubelet[2822]: W0213 16:04:34.783780 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.783839 kubelet[2822]: E0213 16:04:34.783796 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.783882 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.784486 kubelet[2822]: W0213 16:04:34.783888 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.783902 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.783996 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.784486 kubelet[2822]: W0213 16:04:34.784002 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.784013 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.784121 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.784486 kubelet[2822]: W0213 16:04:34.784127 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.784138 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.784486 kubelet[2822]: E0213 16:04:34.784270 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785086 kubelet[2822]: W0213 16:04:34.784276 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784287 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784451 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785086 kubelet[2822]: W0213 16:04:34.784456 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784469 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784551 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785086 kubelet[2822]: W0213 16:04:34.784556 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784563 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785086 kubelet[2822]: E0213 16:04:34.784669 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785086 kubelet[2822]: W0213 16:04:34.784674 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.784681 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.784856 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785330 kubelet[2822]: W0213 16:04:34.784862 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.784874 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.785012 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785330 kubelet[2822]: W0213 16:04:34.785016 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.785022 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.785102 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785330 kubelet[2822]: W0213 16:04:34.785106 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785330 kubelet[2822]: E0213 16:04:34.785117 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 16:04:34.785567 kubelet[2822]: E0213 16:04:34.785195 2822 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 16:04:34.785567 kubelet[2822]: W0213 16:04:34.785199 2822 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 16:04:34.785567 kubelet[2822]: E0213 16:04:34.785204 2822 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 16:04:35.146480 containerd[1557]: time="2025-02-13T16:04:35.146443335Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:35.146916 containerd[1557]: time="2025-02-13T16:04:35.146899830Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Feb 13 16:04:35.147486 containerd[1557]: time="2025-02-13T16:04:35.147187063Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:35.148180 containerd[1557]: time="2025-02-13T16:04:35.148159880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:35.148813 containerd[1557]: time="2025-02-13T16:04:35.148561490Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.653237681s" Feb 13 16:04:35.148813 containerd[1557]: time="2025-02-13T16:04:35.148579918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 16:04:35.150129 containerd[1557]: time="2025-02-13T16:04:35.150117225Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 16:04:35.156003 containerd[1557]: time="2025-02-13T16:04:35.155981290Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165\"" Feb 13 16:04:35.158180 containerd[1557]: time="2025-02-13T16:04:35.156999749Z" level=info msg="StartContainer for \"5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165\"" Feb 13 16:04:35.177996 systemd[1]: Started cri-containerd-5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165.scope - libcontainer container 5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165. Feb 13 16:04:35.195089 containerd[1557]: time="2025-02-13T16:04:35.195068129Z" level=info msg="StartContainer for \"5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165\" returns successfully" Feb 13 16:04:35.202870 systemd[1]: cri-containerd-5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165.scope: Deactivated successfully. Feb 13 16:04:35.223966 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165-rootfs.mount: Deactivated successfully. 
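[editor's note] The repeated kubelet errors above (driver-call.go:262/149 and plugins.go:695) come from the kubelet's periodic probe of the FlexVolume plugin directory: the expected executable /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not installed yet, so each `init` invocation fails with "executable file not found in $PATH", the output is empty, and the JSON decode reports "unexpected end of JSON input". The ghcr.io/flatcar/calico/pod2daemon-flexvol container pulled and started here is what eventually provides that binary. The sketch below is a hypothetical, minimal stand-in showing only the call protocol a FlexVolume driver must satisfy on `init`; it is not Calico's actual uds driver.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// driverStatus mirrors the JSON envelope the kubelet's FlexVolume
// driver-call.go expects on stdout from every driver invocation.
type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) < 2 {
		os.Exit(1)
	}
	switch os.Args[1] {
	case "init":
		// Printing nothing here is exactly what produces the
		// "unexpected end of JSON input" errors in the log above.
		out, _ := json.Marshal(driverStatus{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		fmt.Println(string(out))
	default:
		// Any call the driver does not implement should report "Not supported".
		out, _ := json.Marshal(driverStatus{Status: "Not supported"})
		fmt.Println(string(out))
		os.Exit(1)
	}
}
```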
Feb 13 16:04:35.500083 containerd[1557]: time="2025-02-13T16:04:35.499978675Z" level=info msg="shim disconnected" id=5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165 namespace=k8s.io Feb 13 16:04:35.500354 containerd[1557]: time="2025-02-13T16:04:35.500222212Z" level=warning msg="cleaning up after shim disconnected" id=5ba6feb2ce187ddb6435d14fe1aa67dbd9ae387cad2095a996cbfa81f6e93165 namespace=k8s.io Feb 13 16:04:35.500354 containerd[1557]: time="2025-02-13T16:04:35.500236910Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:04:35.672434 kubelet[2822]: E0213 16:04:35.672340 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:35.771508 containerd[1557]: time="2025-02-13T16:04:35.771432669Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 16:04:37.671819 kubelet[2822]: E0213 16:04:37.671789 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:39.065223 containerd[1557]: time="2025-02-13T16:04:39.065110710Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:39.065958 containerd[1557]: time="2025-02-13T16:04:39.065759585Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 16:04:39.066452 containerd[1557]: time="2025-02-13T16:04:39.066202960Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:39.068060 containerd[1557]: time="2025-02-13T16:04:39.068009853Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:39.068580 containerd[1557]: time="2025-02-13T16:04:39.068558055Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 3.297102096s" Feb 13 16:04:39.068580 containerd[1557]: time="2025-02-13T16:04:39.068578554Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 16:04:39.071444 containerd[1557]: time="2025-02-13T16:04:39.071420646Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 16:04:39.090538 containerd[1557]: time="2025-02-13T16:04:39.090470491Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for 
&ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8\"" Feb 13 16:04:39.091605 containerd[1557]: time="2025-02-13T16:04:39.090831888Z" level=info msg="StartContainer for \"557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8\"" Feb 13 16:04:39.131730 systemd[1]: Started cri-containerd-557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8.scope - libcontainer container 557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8. Feb 13 16:04:39.149371 containerd[1557]: time="2025-02-13T16:04:39.149347427Z" level=info msg="StartContainer for \"557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8\" returns successfully" Feb 13 16:04:39.672043 kubelet[2822]: E0213 16:04:39.672009 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:40.966229 systemd[1]: cri-containerd-557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8.scope: Deactivated successfully. Feb 13 16:04:40.966447 systemd[1]: cri-containerd-557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8.scope: Consumed 265ms CPU time, 148.5M memory peak, 12K read from disk, 151M written to disk. Feb 13 16:04:40.993891 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8-rootfs.mount: Deactivated successfully. Feb 13 16:04:41.049812 kubelet[2822]: I0213 16:04:41.049798 2822 kubelet_node_status.go:502] "Fast updating node status as it just became ready" Feb 13 16:04:41.065212 containerd[1557]: time="2025-02-13T16:04:41.065175673Z" level=info msg="shim disconnected" id=557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8 namespace=k8s.io Feb 13 16:04:41.065605 containerd[1557]: time="2025-02-13T16:04:41.065593250Z" level=warning msg="cleaning up after shim disconnected" id=557e772ae16f3e2dc4cf1a63a28bc29ab6f8637d40548e65b1c0d424d210e5d8 namespace=k8s.io Feb 13 16:04:41.066745 containerd[1557]: time="2025-02-13T16:04:41.065660507Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 16:04:41.075727 kubelet[2822]: W0213 16:04:41.075703 2822 reflector.go:569] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'localhost' and this object Feb 13 16:04:41.076299 kubelet[2822]: I0213 16:04:41.076218 2822 status_manager.go:890] "Failed to get status for pod" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" pod="kube-system/coredns-668d6bf9bc-znr52" err="pods \"coredns-668d6bf9bc-znr52\" is forbidden: User \"system:node:localhost\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'localhost' and this object" Feb 13 16:04:41.078957 kubelet[2822]: E0213 16:04:41.078932 2822 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:localhost\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found 
between node 'localhost' and this object" logger="UnhandledError" Feb 13 16:04:41.082468 containerd[1557]: time="2025-02-13T16:04:41.082435840Z" level=warning msg="cleanup warnings time=\"2025-02-13T16:04:41Z\" level=warning msg=\"failed to remove runc container\" error=\"runc did not terminate successfully: exit status 255: \" runtime=io.containerd.runc.v2\n" namespace=k8s.io Feb 13 16:04:41.087284 systemd[1]: Created slice kubepods-burstable-pod7dfb39fc_a74d_477a_9521_d86660f1b99b.slice - libcontainer container kubepods-burstable-pod7dfb39fc_a74d_477a_9521_d86660f1b99b.slice. Feb 13 16:04:41.096034 systemd[1]: Created slice kubepods-besteffort-podf8381855_f1fc_4a49_b9b0_17a3731a7463.slice - libcontainer container kubepods-besteffort-podf8381855_f1fc_4a49_b9b0_17a3731a7463.slice. Feb 13 16:04:41.103133 systemd[1]: Created slice kubepods-burstable-pod4ece0321_0375_43da_8c1b_713662433b6a.slice - libcontainer container kubepods-burstable-pod4ece0321_0375_43da_8c1b_713662433b6a.slice. Feb 13 16:04:41.111134 systemd[1]: Created slice kubepods-besteffort-pod3d909c6e_816b_4c0f_b575_428d389c17b0.slice - libcontainer container kubepods-besteffort-pod3d909c6e_816b_4c0f_b575_428d389c17b0.slice. Feb 13 16:04:41.117801 systemd[1]: Created slice kubepods-besteffort-pod5eb4ecb4_cc42_4d1d_ab4a_ce503a56daa5.slice - libcontainer container kubepods-besteffort-pod5eb4ecb4_cc42_4d1d_ab4a_ce503a56daa5.slice. Feb 13 16:04:41.122036 kubelet[2822]: I0213 16:04:41.122020 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-585ms\" (UniqueName: \"kubernetes.io/projected/f8381855-f1fc-4a49-b9b0-17a3731a7463-kube-api-access-585ms\") pod \"calico-apiserver-85f996756d-tfwqr\" (UID: \"f8381855-f1fc-4a49-b9b0-17a3731a7463\") " pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.122121 kubelet[2822]: I0213 16:04:41.122111 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lf78\" (UniqueName: \"kubernetes.io/projected/5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5-kube-api-access-8lf78\") pod \"calico-kube-controllers-6cd7f7ffd8-hvv7z\" (UID: \"5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5\") " pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.122171 kubelet[2822]: I0213 16:04:41.122164 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3d909c6e-816b-4c0f-b575-428d389c17b0-calico-apiserver-certs\") pod \"calico-apiserver-85f996756d-tldgb\" (UID: \"3d909c6e-816b-4c0f-b575-428d389c17b0\") " pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.122249 kubelet[2822]: I0213 16:04:41.122242 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz2dn\" (UniqueName: \"kubernetes.io/projected/4ece0321-0375-43da-8c1b-713662433b6a-kube-api-access-pz2dn\") pod \"coredns-668d6bf9bc-x9p69\" (UID: \"4ece0321-0375-43da-8c1b-713662433b6a\") " pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:41.122361 kubelet[2822]: I0213 16:04:41.122290 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7dfb39fc-a74d-477a-9521-d86660f1b99b-config-volume\") pod \"coredns-668d6bf9bc-znr52\" (UID: \"7dfb39fc-a74d-477a-9521-d86660f1b99b\") " pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 
16:04:41.122361 kubelet[2822]: I0213 16:04:41.122304 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvtj\" (UniqueName: \"kubernetes.io/projected/7dfb39fc-a74d-477a-9521-d86660f1b99b-kube-api-access-ztvtj\") pod \"coredns-668d6bf9bc-znr52\" (UID: \"7dfb39fc-a74d-477a-9521-d86660f1b99b\") " pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:41.122361 kubelet[2822]: I0213 16:04:41.122312 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw2fv\" (UniqueName: \"kubernetes.io/projected/3d909c6e-816b-4c0f-b575-428d389c17b0-kube-api-access-rw2fv\") pod \"calico-apiserver-85f996756d-tldgb\" (UID: \"3d909c6e-816b-4c0f-b575-428d389c17b0\") " pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.122361 kubelet[2822]: I0213 16:04:41.122321 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4ece0321-0375-43da-8c1b-713662433b6a-config-volume\") pod \"coredns-668d6bf9bc-x9p69\" (UID: \"4ece0321-0375-43da-8c1b-713662433b6a\") " pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:41.122361 kubelet[2822]: I0213 16:04:41.122329 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5-tigera-ca-bundle\") pod \"calico-kube-controllers-6cd7f7ffd8-hvv7z\" (UID: \"5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5\") " pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.122715 kubelet[2822]: I0213 16:04:41.122340 2822 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f8381855-f1fc-4a49-b9b0-17a3731a7463-calico-apiserver-certs\") pod \"calico-apiserver-85f996756d-tfwqr\" (UID: \"f8381855-f1fc-4a49-b9b0-17a3731a7463\") " pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.400443 containerd[1557]: time="2025-02-13T16:04:41.400417697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:0,}" Feb 13 16:04:41.416154 containerd[1557]: time="2025-02-13T16:04:41.416047325Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:0,}" Feb 13 16:04:41.427742 containerd[1557]: time="2025-02-13T16:04:41.427725563Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:0,}" Feb 13 16:04:41.574826 containerd[1557]: time="2025-02-13T16:04:41.574733460Z" level=error msg="Failed to destroy network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.577073 containerd[1557]: time="2025-02-13T16:04:41.576402124Z" level=error msg="Failed to destroy network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.578230 containerd[1557]: time="2025-02-13T16:04:41.578208200Z" level=error msg="encountered an error cleaning up failed sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.578323 containerd[1557]: time="2025-02-13T16:04:41.578311498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.579596 containerd[1557]: time="2025-02-13T16:04:41.579582480Z" level=error msg="Failed to destroy network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.580232 containerd[1557]: time="2025-02-13T16:04:41.579805400Z" level=error msg="encountered an error cleaning up failed sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.580303 containerd[1557]: time="2025-02-13T16:04:41.580290656Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.581035 containerd[1557]: time="2025-02-13T16:04:41.581022552Z" level=error msg="encountered an error cleaning up failed sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.581131 containerd[1557]: time="2025-02-13T16:04:41.581119864Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.581903 kubelet[2822]: E0213 
16:04:41.581876 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.582587 kubelet[2822]: E0213 16:04:41.581933 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.582587 kubelet[2822]: E0213 16:04:41.581950 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.582587 kubelet[2822]: E0213 16:04:41.581980 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:41.582717 kubelet[2822]: E0213 16:04:41.582014 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.582717 kubelet[2822]: E0213 16:04:41.582026 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.582717 kubelet[2822]: E0213 16:04:41.582034 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.582776 kubelet[2822]: E0213 16:04:41.582047 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:41.582776 kubelet[2822]: E0213 16:04:41.581877 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.582776 kubelet[2822]: E0213 16:04:41.582064 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.582856 kubelet[2822]: E0213 16:04:41.582072 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.582856 kubelet[2822]: E0213 16:04:41.582084 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:41.675316 systemd[1]: Created slice kubepods-besteffort-pod4e58e6b2_6097_4f35_ba27_10ac9fc1ce49.slice - libcontainer container kubepods-besteffort-pod4e58e6b2_6097_4f35_ba27_10ac9fc1ce49.slice. 
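The sandbox failures recorded above all trace back to the same missing file, /var/lib/calico/nodename, which the calico/node container creates once it is running with /var/lib/calico/ bind-mounted from the host — exactly what the error text asks the operator to check. A minimal sketch of that failing check, written here in Go purely as an illustration and not taken from the Calico source:

    package main

    import (
    	"fmt"
    	"os"
    )

    func main() {
    	// calico-node writes this file after it starts with /var/lib/calico/
    	// mounted from the host; the CNI plugin stats it before wiring a pod.
    	const nodenameFile = "/var/lib/calico/nodename"

    	if _, err := os.Stat(nodenameFile); err != nil {
    		// Mirrors the condition behind every "failed (add)" message above.
    		fmt.Printf("plugin type=%q failed (add): %v\n", "calico", err)
    		return
    	}

    	name, err := os.ReadFile(nodenameFile)
    	if err != nil {
    		fmt.Printf("read %s: %v\n", nodenameFile, err)
    		return
    	}
    	fmt.Printf("calico node name: %s\n", name)
    }

Once the calico/node image (pulled a few entries below as ghcr.io/flatcar/calico/node:v3.29.1) is running and has populated that directory, the sandbox retries that follow in this log should begin to succeed.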
Feb 13 16:04:41.676802 containerd[1557]: time="2025-02-13T16:04:41.676779978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:0,}" Feb 13 16:04:41.715599 containerd[1557]: time="2025-02-13T16:04:41.715519886Z" level=error msg="Failed to destroy network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.715906 containerd[1557]: time="2025-02-13T16:04:41.715835633Z" level=error msg="encountered an error cleaning up failed sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.721254 containerd[1557]: time="2025-02-13T16:04:41.721231221Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.721652 kubelet[2822]: E0213 16:04:41.721524 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.721652 kubelet[2822]: E0213 16:04:41.721565 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:41.721652 kubelet[2822]: E0213 16:04:41.721578 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:41.722418 kubelet[2822]: E0213 16:04:41.721791 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:41.793525 kubelet[2822]: I0213 16:04:41.793505 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34" Feb 13 16:04:41.795372 kubelet[2822]: I0213 16:04:41.794222 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5" Feb 13 16:04:41.795540 containerd[1557]: time="2025-02-13T16:04:41.794373150Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:41.795675 containerd[1557]: time="2025-02-13T16:04:41.795599975Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:41.800621 containerd[1557]: time="2025-02-13T16:04:41.800508599Z" level=info msg="Ensure that sandbox e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5 in task-service has been cleanup successfully" Feb 13 16:04:41.800792 containerd[1557]: time="2025-02-13T16:04:41.800769188Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:41.800958 containerd[1557]: time="2025-02-13T16:04:41.800827746Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:41.801864 containerd[1557]: time="2025-02-13T16:04:41.801853516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:1,}" Feb 13 16:04:41.803039 containerd[1557]: time="2025-02-13T16:04:41.802776488Z" level=info msg="Ensure that sandbox f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34 in task-service has been cleanup successfully" Feb 13 16:04:41.803039 containerd[1557]: time="2025-02-13T16:04:41.802901495Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:41.803039 containerd[1557]: time="2025-02-13T16:04:41.802912970Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:41.805897 containerd[1557]: time="2025-02-13T16:04:41.803136867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:1,}" Feb 13 16:04:41.805933 kubelet[2822]: I0213 16:04:41.804817 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae" Feb 13 16:04:41.806668 containerd[1557]: time="2025-02-13T16:04:41.806395891Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:41.806668 containerd[1557]: time="2025-02-13T16:04:41.806517174Z" level=info msg="Ensure that sandbox b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae in task-service has been cleanup successfully" Feb 13 
16:04:41.808141 containerd[1557]: time="2025-02-13T16:04:41.808101248Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:41.808141 containerd[1557]: time="2025-02-13T16:04:41.808115429Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:41.809888 containerd[1557]: time="2025-02-13T16:04:41.808873647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 16:04:41.809888 containerd[1557]: time="2025-02-13T16:04:41.809006451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:1,}" Feb 13 16:04:41.811205 kubelet[2822]: I0213 16:04:41.811108 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f" Feb 13 16:04:41.811556 containerd[1557]: time="2025-02-13T16:04:41.811361888Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:41.811762 containerd[1557]: time="2025-02-13T16:04:41.811614493Z" level=info msg="Ensure that sandbox 7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f in task-service has been cleanup successfully" Feb 13 16:04:41.811895 containerd[1557]: time="2025-02-13T16:04:41.811879629Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:41.811895 containerd[1557]: time="2025-02-13T16:04:41.811890721Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:41.813033 containerd[1557]: time="2025-02-13T16:04:41.812938443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:1,}" Feb 13 16:04:41.891302 containerd[1557]: time="2025-02-13T16:04:41.891267843Z" level=error msg="Failed to destroy network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.891503 containerd[1557]: time="2025-02-13T16:04:41.891484028Z" level=error msg="encountered an error cleaning up failed sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.891541 containerd[1557]: time="2025-02-13T16:04:41.891526663Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.891913 containerd[1557]: 
time="2025-02-13T16:04:41.891684282Z" level=error msg="Failed to destroy network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.891913 containerd[1557]: time="2025-02-13T16:04:41.891870885Z" level=error msg="encountered an error cleaning up failed sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.892087 kubelet[2822]: E0213 16:04:41.891676 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.892087 kubelet[2822]: E0213 16:04:41.891722 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.892087 kubelet[2822]: E0213 16:04:41.891745 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:41.892154 containerd[1557]: time="2025-02-13T16:04:41.892001354Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.892196 kubelet[2822]: E0213 16:04:41.891776 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:41.892196 kubelet[2822]: E0213 16:04:41.892064 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.892196 kubelet[2822]: E0213 16:04:41.892174 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.892278 kubelet[2822]: E0213 16:04:41.892187 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:41.892278 kubelet[2822]: E0213 16:04:41.892205 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:41.899952 containerd[1557]: time="2025-02-13T16:04:41.899919802Z" level=error msg="Failed to destroy network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.900154 containerd[1557]: time="2025-02-13T16:04:41.900128399Z" level=error msg="encountered an error cleaning up failed sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.900191 containerd[1557]: time="2025-02-13T16:04:41.900179074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox 
\"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.900332 kubelet[2822]: E0213 16:04:41.900309 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.900368 kubelet[2822]: E0213 16:04:41.900358 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:41.900389 kubelet[2822]: E0213 16:04:41.900372 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:41.900409 kubelet[2822]: E0213 16:04:41.900397 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:41.906060 containerd[1557]: time="2025-02-13T16:04:41.906013257Z" level=error msg="Failed to destroy network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.906251 containerd[1557]: time="2025-02-13T16:04:41.906228700Z" level=error msg="encountered an error cleaning up failed sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.906285 containerd[1557]: time="2025-02-13T16:04:41.906264745Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:1,} 
failed, error" error="failed to setup network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.906609 kubelet[2822]: E0213 16:04:41.906391 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:41.906609 kubelet[2822]: E0213 16:04:41.906429 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.906609 kubelet[2822]: E0213 16:04:41.906441 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:41.906741 kubelet[2822]: E0213 16:04:41.906473 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:41.993015 containerd[1557]: time="2025-02-13T16:04:41.992768029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:41.999819 systemd[1]: run-netns-cni\x2db30d375e\x2d6dbc\x2d9ca8\x2dc965\x2da4e908ddaab6.mount: Deactivated successfully. Feb 13 16:04:41.999877 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5-shm.mount: Deactivated successfully. 
Feb 13 16:04:42.008895 containerd[1557]: time="2025-02-13T16:04:42.008783625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:0,}" Feb 13 16:04:42.043780 containerd[1557]: time="2025-02-13T16:04:42.043747497Z" level=error msg="Failed to destroy network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.044419 containerd[1557]: time="2025-02-13T16:04:42.044154186Z" level=error msg="encountered an error cleaning up failed sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.044419 containerd[1557]: time="2025-02-13T16:04:42.044279803Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.044862 kubelet[2822]: E0213 16:04:42.044522 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.044862 kubelet[2822]: E0213 16:04:42.044578 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:42.044862 kubelet[2822]: E0213 16:04:42.044591 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:42.045854 kubelet[2822]: E0213 16:04:42.044618 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:42.045601 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9-shm.mount: Deactivated successfully. Feb 13 16:04:42.055508 containerd[1557]: time="2025-02-13T16:04:42.055477079Z" level=error msg="Failed to destroy network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.055882 containerd[1557]: time="2025-02-13T16:04:42.055767283Z" level=error msg="encountered an error cleaning up failed sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.055882 containerd[1557]: time="2025-02-13T16:04:42.055802305Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.056254 kubelet[2822]: E0213 16:04:42.056019 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.056254 kubelet[2822]: E0213 16:04:42.056054 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:42.056254 kubelet[2822]: E0213 16:04:42.056067 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:42.056513 kubelet[2822]: E0213 16:04:42.056091 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:42.814199 kubelet[2822]: I0213 16:04:42.813322 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556" Feb 13 16:04:42.814299 containerd[1557]: time="2025-02-13T16:04:42.813614658Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:42.814299 containerd[1557]: time="2025-02-13T16:04:42.813758306Z" level=info msg="Ensure that sandbox 61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556 in task-service has been cleanup successfully" Feb 13 16:04:42.814299 containerd[1557]: time="2025-02-13T16:04:42.813879108Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:42.814299 containerd[1557]: time="2025-02-13T16:04:42.813887430Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:42.815023 containerd[1557]: time="2025-02-13T16:04:42.814868299Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:42.815023 containerd[1557]: time="2025-02-13T16:04:42.814971240Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:42.815023 containerd[1557]: time="2025-02-13T16:04:42.814978566Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:42.815470 containerd[1557]: time="2025-02-13T16:04:42.815455975Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:2,}" Feb 13 16:04:42.815716 kubelet[2822]: I0213 16:04:42.815605 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816" Feb 13 16:04:42.815950 containerd[1557]: time="2025-02-13T16:04:42.815829093Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:42.815987 containerd[1557]: time="2025-02-13T16:04:42.815951866Z" level=info msg="Ensure that sandbox 492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816 in task-service has been cleanup successfully" Feb 13 16:04:42.816259 containerd[1557]: time="2025-02-13T16:04:42.816085801Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:42.816259 containerd[1557]: time="2025-02-13T16:04:42.816093391Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 
13 16:04:42.816299 containerd[1557]: time="2025-02-13T16:04:42.816275827Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:42.816429 containerd[1557]: time="2025-02-13T16:04:42.816316454Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:42.816429 containerd[1557]: time="2025-02-13T16:04:42.816325606Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:42.816939 containerd[1557]: time="2025-02-13T16:04:42.816789063Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:2,}" Feb 13 16:04:42.817961 kubelet[2822]: I0213 16:04:42.817950 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420" Feb 13 16:04:42.818296 containerd[1557]: time="2025-02-13T16:04:42.818232083Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:42.818508 containerd[1557]: time="2025-02-13T16:04:42.818444895Z" level=info msg="Ensure that sandbox abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420 in task-service has been cleanup successfully" Feb 13 16:04:42.818726 containerd[1557]: time="2025-02-13T16:04:42.818595786Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:42.818726 containerd[1557]: time="2025-02-13T16:04:42.818604769Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:42.819205 containerd[1557]: time="2025-02-13T16:04:42.819160417Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:42.819446 containerd[1557]: time="2025-02-13T16:04:42.819292893Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:42.819446 containerd[1557]: time="2025-02-13T16:04:42.819301879Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:42.820993 containerd[1557]: time="2025-02-13T16:04:42.820855104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:2,}" Feb 13 16:04:42.821234 kubelet[2822]: I0213 16:04:42.821220 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a" Feb 13 16:04:42.821688 containerd[1557]: time="2025-02-13T16:04:42.821462847Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:42.821688 containerd[1557]: time="2025-02-13T16:04:42.821585003Z" level=info msg="Ensure that sandbox 342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a in task-service has been cleanup successfully" Feb 13 16:04:42.822293 containerd[1557]: time="2025-02-13T16:04:42.822259481Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" 
successfully" Feb 13 16:04:42.822293 containerd[1557]: time="2025-02-13T16:04:42.822271675Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:42.822795 containerd[1557]: time="2025-02-13T16:04:42.822690110Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:42.822795 containerd[1557]: time="2025-02-13T16:04:42.822758964Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:42.822795 containerd[1557]: time="2025-02-13T16:04:42.822766389Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:42.823758 containerd[1557]: time="2025-02-13T16:04:42.823288820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:2,}" Feb 13 16:04:42.825381 kubelet[2822]: I0213 16:04:42.825132 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343" Feb 13 16:04:42.825541 containerd[1557]: time="2025-02-13T16:04:42.825488226Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:42.825632 containerd[1557]: time="2025-02-13T16:04:42.825618888Z" level=info msg="Ensure that sandbox b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343 in task-service has been cleanup successfully" Feb 13 16:04:42.826096 containerd[1557]: time="2025-02-13T16:04:42.825764064Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:42.826096 containerd[1557]: time="2025-02-13T16:04:42.825775547Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:42.827041 containerd[1557]: time="2025-02-13T16:04:42.826913660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:1,}" Feb 13 16:04:42.829563 kubelet[2822]: I0213 16:04:42.829532 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9" Feb 13 16:04:42.832250 containerd[1557]: time="2025-02-13T16:04:42.831988682Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:42.833857 containerd[1557]: time="2025-02-13T16:04:42.833763458Z" level=info msg="Ensure that sandbox 4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9 in task-service has been cleanup successfully" Feb 13 16:04:42.834061 containerd[1557]: time="2025-02-13T16:04:42.833950793Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:42.834061 containerd[1557]: time="2025-02-13T16:04:42.833973750Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:42.835891 containerd[1557]: time="2025-02-13T16:04:42.835724901Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:1,}" Feb 13 16:04:42.929022 containerd[1557]: time="2025-02-13T16:04:42.928984733Z" level=error msg="Failed to destroy network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.929919 containerd[1557]: time="2025-02-13T16:04:42.929899272Z" level=error msg="encountered an error cleaning up failed sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.929964 containerd[1557]: time="2025-02-13T16:04:42.929945922Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.930352 kubelet[2822]: E0213 16:04:42.930327 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.930407 kubelet[2822]: E0213 16:04:42.930365 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:42.930407 kubelet[2822]: E0213 16:04:42.930391 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:42.930446 kubelet[2822]: E0213 16:04:42.930416 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:42.948421 containerd[1557]: time="2025-02-13T16:04:42.947804856Z" level=error msg="Failed to destroy network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.949531 containerd[1557]: time="2025-02-13T16:04:42.949500296Z" level=error msg="encountered an error cleaning up failed sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.949741 containerd[1557]: time="2025-02-13T16:04:42.949553240Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.949934 kubelet[2822]: E0213 16:04:42.949913 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.950010 kubelet[2822]: E0213 16:04:42.950001 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:42.950063 kubelet[2822]: E0213 16:04:42.950045 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:42.950152 kubelet[2822]: E0213 16:04:42.950138 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:42.952823 containerd[1557]: time="2025-02-13T16:04:42.952788271Z" level=error msg="Failed to destroy network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.953081 containerd[1557]: time="2025-02-13T16:04:42.953063649Z" level=error msg="encountered an error cleaning up failed sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.953111 containerd[1557]: time="2025-02-13T16:04:42.953099904Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.953440 kubelet[2822]: E0213 16:04:42.953422 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.953669 kubelet[2822]: E0213 16:04:42.953501 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:42.953669 kubelet[2822]: E0213 16:04:42.953517 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:42.953669 kubelet[2822]: E0213 16:04:42.953546 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:42.954444 containerd[1557]: time="2025-02-13T16:04:42.954092833Z" level=error msg="Failed to destroy network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.954682 containerd[1557]: time="2025-02-13T16:04:42.954667898Z" level=error msg="encountered an error cleaning up failed sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.954749 containerd[1557]: time="2025-02-13T16:04:42.954737887Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.954996 kubelet[2822]: E0213 16:04:42.954976 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.955037 kubelet[2822]: E0213 16:04:42.955006 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:42.955037 kubelet[2822]: E0213 16:04:42.955018 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:42.955075 kubelet[2822]: E0213 16:04:42.955041 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:42.963576 containerd[1557]: time="2025-02-13T16:04:42.963498049Z" level=error msg="Failed to destroy network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.963902 containerd[1557]: time="2025-02-13T16:04:42.963876195Z" level=error msg="encountered an error cleaning up failed sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.963988 containerd[1557]: time="2025-02-13T16:04:42.963976456Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.964815 kubelet[2822]: E0213 16:04:42.964693 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.964815 kubelet[2822]: E0213 16:04:42.964727 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:42.964815 kubelet[2822]: E0213 16:04:42.964742 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:42.964926 containerd[1557]: 
time="2025-02-13T16:04:42.964741595Z" level=error msg="Failed to destroy network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.964950 kubelet[2822]: E0213 16:04:42.964769 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:42.965188 containerd[1557]: time="2025-02-13T16:04:42.965145780Z" level=error msg="encountered an error cleaning up failed sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.965188 containerd[1557]: time="2025-02-13T16:04:42.965170535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.965408 kubelet[2822]: E0213 16:04:42.965316 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:42.965408 kubelet[2822]: E0213 16:04:42.965354 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:42.965408 kubelet[2822]: E0213 16:04:42.965364 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:42.965473 kubelet[2822]: E0213 16:04:42.965386 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:42.996417 systemd[1]: run-netns-cni\x2d082df5a2\x2db688\x2d2d3a\x2d44de\x2dc427217ddd61.mount: Deactivated successfully. Feb 13 16:04:42.996478 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343-shm.mount: Deactivated successfully. Feb 13 16:04:42.996519 systemd[1]: run-netns-cni\x2db41500b1\x2dda10\x2d5a3b\x2dabf0\x2d3aafb98c0225.mount: Deactivated successfully. Feb 13 16:04:42.996558 systemd[1]: run-netns-cni\x2dd1f3250a\x2d4bcb\x2d8056\x2d6174\x2d2caff2a55dca.mount: Deactivated successfully. Feb 13 16:04:42.996593 systemd[1]: run-netns-cni\x2da2bc1b85\x2ddc40\x2de482\x2d152a\x2d251e0e0cf736.mount: Deactivated successfully. Feb 13 16:04:42.996628 systemd[1]: run-netns-cni\x2da47fc844\x2da2ca\x2d7e12\x2d3d0a\x2d2a05bd99e08d.mount: Deactivated successfully. Feb 13 16:04:42.996681 systemd[1]: run-netns-cni\x2d9a376714\x2d4bb4\x2daab1\x2d93ae\x2db0c7c0d294bd.mount: Deactivated successfully. 
Feb 13 16:04:43.832619 kubelet[2822]: I0213 16:04:43.832561 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d" Feb 13 16:04:43.834762 containerd[1557]: time="2025-02-13T16:04:43.834220683Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:04:43.834762 containerd[1557]: time="2025-02-13T16:04:43.834359956Z" level=info msg="Ensure that sandbox 70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d in task-service has been cleanup successfully" Feb 13 16:04:43.837665 containerd[1557]: time="2025-02-13T16:04:43.836179874Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:04:43.837665 containerd[1557]: time="2025-02-13T16:04:43.836191886Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:04:43.837665 containerd[1557]: time="2025-02-13T16:04:43.837472645Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:43.837665 containerd[1557]: time="2025-02-13T16:04:43.837521922Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:43.837665 containerd[1557]: time="2025-02-13T16:04:43.837528234Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:43.837682 systemd[1]: run-netns-cni\x2d67992ee2\x2df329\x2d919c\x2d8f48\x2d8517d2eeecaf.mount: Deactivated successfully. Feb 13 16:04:43.838653 containerd[1557]: time="2025-02-13T16:04:43.838376255Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:43.838653 containerd[1557]: time="2025-02-13T16:04:43.838414542Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:43.838653 containerd[1557]: time="2025-02-13T16:04:43.838420963Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:43.838991 containerd[1557]: time="2025-02-13T16:04:43.838780585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:3,}" Feb 13 16:04:43.839166 kubelet[2822]: I0213 16:04:43.839157 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c" Feb 13 16:04:43.849651 containerd[1557]: time="2025-02-13T16:04:43.849363777Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:04:43.850217 containerd[1557]: time="2025-02-13T16:04:43.850205476Z" level=info msg="Ensure that sandbox c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c in task-service has been cleanup successfully" Feb 13 16:04:43.851454 containerd[1557]: time="2025-02-13T16:04:43.851444015Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:04:43.851509 containerd[1557]: time="2025-02-13T16:04:43.851501806Z" level=info msg="StopPodSandbox for 
\"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:04:43.852580 systemd[1]: run-netns-cni\x2dfd5e2c1f\x2dee34\x2df393\x2d9017\x2d5067d0f375c4.mount: Deactivated successfully. Feb 13 16:04:43.854224 containerd[1557]: time="2025-02-13T16:04:43.854106831Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:43.854341 containerd[1557]: time="2025-02-13T16:04:43.854307520Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:43.854341 containerd[1557]: time="2025-02-13T16:04:43.854316840Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:04:43.854555 containerd[1557]: time="2025-02-13T16:04:43.854507121Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:43.854602 containerd[1557]: time="2025-02-13T16:04:43.854594476Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.854675307Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.855249336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:3,}" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.855762712Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.855856779Z" level=info msg="Ensure that sandbox 517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf in task-service has been cleanup successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.855965236Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.855972913Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.856279192Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.856541275Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.856550909Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.857629153Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.858658661Z" level=info msg="Ensure that sandbox 27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36 in task-service has been cleanup successfully" Feb 13 16:04:43.860035 containerd[1557]: 
time="2025-02-13T16:04:43.859141954Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859158131Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859166945Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859181432Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859187590Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859282704Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859493098Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859500602Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859536040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:3,}" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859713611Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859804741Z" level=info msg="Ensure that sandbox c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac in task-service has been cleanup successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859885741Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859890317Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:04:43.860035 containerd[1557]: time="2025-02-13T16:04:43.859989877Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:04:43.857174 systemd[1]: run-netns-cni\x2d9adb4c3b\x2dcc91\x2d7ce0\x2d8431\x2d8a4c1ec6d89d.mount: Deactivated successfully. 
Feb 13 16:04:43.860629 kubelet[2822]: I0213 16:04:43.855428 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf" Feb 13 16:04:43.860629 kubelet[2822]: I0213 16:04:43.857392 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36" Feb 13 16:04:43.860629 kubelet[2822]: I0213 16:04:43.859368 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.860149849Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.860177805Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.860777578Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.860832568Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.860839765Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:43.861185 containerd[1557]: time="2025-02-13T16:04:43.861025670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:2,}" Feb 13 16:04:43.860941 systemd[1]: run-netns-cni\x2d77d20935\x2d7000\x2d74f7\x2d1967\x2d7389fed3fbf3.mount: Deactivated successfully. 
Feb 13 16:04:43.861578 containerd[1557]: time="2025-02-13T16:04:43.861361378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:3,}" Feb 13 16:04:43.880291 kubelet[2822]: I0213 16:04:43.879986 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880394719Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880504944Z" level=info msg="Ensure that sandbox 00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7 in task-service has been cleanup successfully" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880615348Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880623538Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880908984Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880945251Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.880950868Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:43.889069 containerd[1557]: time="2025-02-13T16:04:43.881205731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:2,}" Feb 13 16:04:43.994454 systemd[1]: run-netns-cni\x2d2c06f218\x2d7375\x2d342a\x2da6ba\x2dbbd8502a92c0.mount: Deactivated successfully. Feb 13 16:04:43.994547 systemd[1]: run-netns-cni\x2dcc9f9073\x2dac91\x2d0cf7\x2d94b0\x2de8ab98008fb4.mount: Deactivated successfully. 
Feb 13 16:04:44.398892 containerd[1557]: time="2025-02-13T16:04:44.398781188Z" level=error msg="Failed to destroy network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.399747 containerd[1557]: time="2025-02-13T16:04:44.399010123Z" level=error msg="Failed to destroy network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.399968 containerd[1557]: time="2025-02-13T16:04:44.399955350Z" level=error msg="encountered an error cleaning up failed sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.400167 containerd[1557]: time="2025-02-13T16:04:44.400031975Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.400220 kubelet[2822]: E0213 16:04:44.400182 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.400251 kubelet[2822]: E0213 16:04:44.400226 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:44.400251 kubelet[2822]: E0213 16:04:44.400242 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:44.400292 kubelet[2822]: E0213 16:04:44.400271 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:44.402167 containerd[1557]: time="2025-02-13T16:04:44.402038100Z" level=error msg="encountered an error cleaning up failed sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.402523 containerd[1557]: time="2025-02-13T16:04:44.402074611Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.402740 kubelet[2822]: E0213 16:04:44.402629 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.402740 kubelet[2822]: E0213 16:04:44.402696 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:44.402740 kubelet[2822]: E0213 16:04:44.402711 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:44.402930 kubelet[2822]: E0213 16:04:44.402906 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:44.410087 containerd[1557]: time="2025-02-13T16:04:44.410061562Z" level=error msg="Failed to destroy network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.410818 containerd[1557]: time="2025-02-13T16:04:44.410803136Z" level=error msg="encountered an error cleaning up failed sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.411074 containerd[1557]: time="2025-02-13T16:04:44.411060761Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.411653 kubelet[2822]: E0213 16:04:44.411214 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.411653 kubelet[2822]: E0213 16:04:44.411320 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:44.411653 kubelet[2822]: E0213 16:04:44.411333 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:44.411733 kubelet[2822]: E0213 16:04:44.411360 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:44.421321 containerd[1557]: time="2025-02-13T16:04:44.421294026Z" level=error msg="Failed to destroy network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.422813 containerd[1557]: time="2025-02-13T16:04:44.422710382Z" level=error msg="encountered an error cleaning up failed sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.423038 containerd[1557]: time="2025-02-13T16:04:44.422942010Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.423189 containerd[1557]: time="2025-02-13T16:04:44.422968722Z" level=error msg="Failed to destroy network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.423414 kubelet[2822]: E0213 16:04:44.423393 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.423609 kubelet[2822]: E0213 16:04:44.423504 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:44.423609 kubelet[2822]: E0213 16:04:44.423520 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:44.423609 kubelet[2822]: E0213 16:04:44.423562 2822 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:44.424255 containerd[1557]: time="2025-02-13T16:04:44.423809297Z" level=error msg="encountered an error cleaning up failed sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.424255 containerd[1557]: time="2025-02-13T16:04:44.423847037Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.424695 kubelet[2822]: E0213 16:04:44.424683 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.424763 kubelet[2822]: E0213 16:04:44.424754 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:44.424899 kubelet[2822]: E0213 16:04:44.424830 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:44.424899 kubelet[2822]: E0213 16:04:44.424860 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:44.425697 containerd[1557]: time="2025-02-13T16:04:44.425527387Z" level=error msg="Failed to destroy network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.425796 containerd[1557]: time="2025-02-13T16:04:44.425737228Z" level=error msg="encountered an error cleaning up failed sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.425796 containerd[1557]: time="2025-02-13T16:04:44.425776117Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.426085 kubelet[2822]: E0213 16:04:44.425899 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:44.426085 kubelet[2822]: E0213 16:04:44.425926 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:44.426085 kubelet[2822]: E0213 16:04:44.425939 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:44.426156 kubelet[2822]: E0213 16:04:44.425962 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:44.883103 kubelet[2822]: I0213 16:04:44.883047 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f" Feb 13 16:04:44.884885 containerd[1557]: time="2025-02-13T16:04:44.884812954Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:04:44.885653 containerd[1557]: time="2025-02-13T16:04:44.885598597Z" level=info msg="Ensure that sandbox 5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f in task-service has been cleanup successfully" Feb 13 16:04:44.885953 containerd[1557]: time="2025-02-13T16:04:44.885938600Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:04:44.885953 containerd[1557]: time="2025-02-13T16:04:44.885951520Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:04:44.886897 containerd[1557]: time="2025-02-13T16:04:44.886881366Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:04:44.887070 containerd[1557]: time="2025-02-13T16:04:44.887056328Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:04:44.887310 containerd[1557]: time="2025-02-13T16:04:44.887292895Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:04:44.887979 containerd[1557]: time="2025-02-13T16:04:44.887690604Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:44.887979 containerd[1557]: time="2025-02-13T16:04:44.887748292Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:44.887979 containerd[1557]: time="2025-02-13T16:04:44.887756039Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:04:44.888048 kubelet[2822]: I0213 16:04:44.887756 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf" Feb 13 16:04:44.888073 containerd[1557]: time="2025-02-13T16:04:44.887987381Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:04:44.888092 containerd[1557]: time="2025-02-13T16:04:44.888076610Z" level=info msg="Ensure that sandbox 620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf in task-service has been cleanup successfully" Feb 13 16:04:44.888407 containerd[1557]: 
time="2025-02-13T16:04:44.888395133Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:44.888438 containerd[1557]: time="2025-02-13T16:04:44.888432338Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:44.888462 containerd[1557]: time="2025-02-13T16:04:44.888438411Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:44.889169 containerd[1557]: time="2025-02-13T16:04:44.888550173Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:04:44.889169 containerd[1557]: time="2025-02-13T16:04:44.888568724Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:04:44.889626 containerd[1557]: time="2025-02-13T16:04:44.889592306Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:4,}" Feb 13 16:04:44.890083 containerd[1557]: time="2025-02-13T16:04:44.890055749Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:04:44.890122 containerd[1557]: time="2025-02-13T16:04:44.890111680Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:04:44.890122 containerd[1557]: time="2025-02-13T16:04:44.890120090Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:04:44.890649 containerd[1557]: time="2025-02-13T16:04:44.890626436Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:44.890705 containerd[1557]: time="2025-02-13T16:04:44.890689958Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:44.890705 containerd[1557]: time="2025-02-13T16:04:44.890700854Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:44.891316 containerd[1557]: time="2025-02-13T16:04:44.891230144Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:44.891423 containerd[1557]: time="2025-02-13T16:04:44.891361823Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:44.891423 containerd[1557]: time="2025-02-13T16:04:44.891369208Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:44.892121 containerd[1557]: time="2025-02-13T16:04:44.892107649Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:4,}" Feb 13 16:04:44.893156 kubelet[2822]: I0213 16:04:44.893148 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8" Feb 13 16:04:44.894190 containerd[1557]: time="2025-02-13T16:04:44.894175332Z" level=info 
msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:04:44.894311 containerd[1557]: time="2025-02-13T16:04:44.894293061Z" level=info msg="Ensure that sandbox 1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8 in task-service has been cleanup successfully" Feb 13 16:04:44.895031 containerd[1557]: time="2025-02-13T16:04:44.895002450Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:04:44.895031 containerd[1557]: time="2025-02-13T16:04:44.895012630Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:04:44.895573 containerd[1557]: time="2025-02-13T16:04:44.895559424Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:04:44.895670 containerd[1557]: time="2025-02-13T16:04:44.895637936Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:04:44.895670 containerd[1557]: time="2025-02-13T16:04:44.895668933Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:04:44.897650 containerd[1557]: time="2025-02-13T16:04:44.897382280Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:44.897650 containerd[1557]: time="2025-02-13T16:04:44.897424201Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:04:44.897650 containerd[1557]: time="2025-02-13T16:04:44.897430406Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:44.898522 containerd[1557]: time="2025-02-13T16:04:44.898174327Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:44.898522 containerd[1557]: time="2025-02-13T16:04:44.898222759Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:44.898522 containerd[1557]: time="2025-02-13T16:04:44.898229883Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:44.898620 kubelet[2822]: I0213 16:04:44.898399 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870" Feb 13 16:04:44.898805 containerd[1557]: time="2025-02-13T16:04:44.898792224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:4,}" Feb 13 16:04:44.898982 containerd[1557]: time="2025-02-13T16:04:44.898970667Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:04:44.899272 containerd[1557]: time="2025-02-13T16:04:44.899136126Z" level=info msg="Ensure that sandbox dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870 in task-service has been cleanup successfully" Feb 13 16:04:44.900703 containerd[1557]: time="2025-02-13T16:04:44.900663561Z" level=info msg="TearDown network for sandbox 
\"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:04:44.900703 containerd[1557]: time="2025-02-13T16:04:44.900674501Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:04:44.901500 containerd[1557]: time="2025-02-13T16:04:44.901452251Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:04:44.901664 containerd[1557]: time="2025-02-13T16:04:44.901546251Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:04:44.902314 containerd[1557]: time="2025-02-13T16:04:44.901707645Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:04:44.904921 containerd[1557]: time="2025-02-13T16:04:44.904874747Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:44.904959 containerd[1557]: time="2025-02-13T16:04:44.904924569Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:44.904959 containerd[1557]: time="2025-02-13T16:04:44.904931119Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:44.906478 containerd[1557]: time="2025-02-13T16:04:44.905305479Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:44.906478 containerd[1557]: time="2025-02-13T16:04:44.905345830Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:44.906478 containerd[1557]: time="2025-02-13T16:04:44.905352356Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:44.906579 kubelet[2822]: I0213 16:04:44.906254 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665" Feb 13 16:04:44.907649 containerd[1557]: time="2025-02-13T16:04:44.907000740Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:04:44.907821 containerd[1557]: time="2025-02-13T16:04:44.907808369Z" level=info msg="Ensure that sandbox e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665 in task-service has been cleanup successfully" Feb 13 16:04:44.908169 containerd[1557]: time="2025-02-13T16:04:44.908158185Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:04:44.908213 containerd[1557]: time="2025-02-13T16:04:44.908205917Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:04:44.910200 containerd[1557]: time="2025-02-13T16:04:44.909970014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:4,}" Feb 13 16:04:44.910577 containerd[1557]: time="2025-02-13T16:04:44.910562605Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 
13 16:04:44.910629 containerd[1557]: time="2025-02-13T16:04:44.910615554Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:04:44.910629 containerd[1557]: time="2025-02-13T16:04:44.910625503Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:04:44.912471 containerd[1557]: time="2025-02-13T16:04:44.912433288Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:44.912514 containerd[1557]: time="2025-02-13T16:04:44.912495321Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:44.912514 containerd[1557]: time="2025-02-13T16:04:44.912503675Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:44.913536 containerd[1557]: time="2025-02-13T16:04:44.913189630Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:3,}" Feb 13 16:04:44.915569 kubelet[2822]: I0213 16:04:44.915556 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323" Feb 13 16:04:44.916285 containerd[1557]: time="2025-02-13T16:04:44.915936014Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:04:44.916285 containerd[1557]: time="2025-02-13T16:04:44.916047556Z" level=info msg="Ensure that sandbox 0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323 in task-service has been cleanup successfully" Feb 13 16:04:44.916285 containerd[1557]: time="2025-02-13T16:04:44.916179120Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:04:44.916285 containerd[1557]: time="2025-02-13T16:04:44.916187765Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:04:44.917635 containerd[1557]: time="2025-02-13T16:04:44.917619707Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:04:44.917695 containerd[1557]: time="2025-02-13T16:04:44.917683484Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:04:44.917718 containerd[1557]: time="2025-02-13T16:04:44.917694598Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:04:44.917852 containerd[1557]: time="2025-02-13T16:04:44.917831416Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:44.917896 containerd[1557]: time="2025-02-13T16:04:44.917880708Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:44.917896 containerd[1557]: time="2025-02-13T16:04:44.917888131Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:44.918097 containerd[1557]: time="2025-02-13T16:04:44.918084353Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:3,}" Feb 13 16:04:45.002158 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870-shm.mount: Deactivated successfully. Feb 13 16:04:45.002393 systemd[1]: run-netns-cni\x2d534b8635\x2d0c59\x2d54eb\x2deac4\x2dcd2f33cd22d8.mount: Deactivated successfully. Feb 13 16:04:45.002432 systemd[1]: run-netns-cni\x2d0b1f0b8b\x2d5358\x2d3152\x2d671e\x2d6d30a56e60f9.mount: Deactivated successfully. Feb 13 16:04:45.002467 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323-shm.mount: Deactivated successfully. Feb 13 16:04:45.002505 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf-shm.mount: Deactivated successfully. Feb 13 16:04:45.002544 systemd[1]: run-netns-cni\x2d398b2b72\x2d9b8c\x2db671\x2d2d85\x2df0c5dbddb471.mount: Deactivated successfully. Feb 13 16:04:45.002580 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8-shm.mount: Deactivated successfully. Feb 13 16:04:45.002616 systemd[1]: run-netns-cni\x2d4744f410\x2d03e3\x2dc3d4\x2d263b\x2d38426afad2dd.mount: Deactivated successfully. Feb 13 16:04:45.002658 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f-shm.mount: Deactivated successfully. Feb 13 16:04:45.047921 containerd[1557]: time="2025-02-13T16:04:45.047844708Z" level=error msg="Failed to destroy network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.048558 containerd[1557]: time="2025-02-13T16:04:45.048136860Z" level=error msg="encountered an error cleaning up failed sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.048558 containerd[1557]: time="2025-02-13T16:04:45.048173157Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.051301 kubelet[2822]: E0213 16:04:45.049186 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.051301 kubelet[2822]: E0213 16:04:45.049223 2822 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:45.051301 kubelet[2822]: E0213 16:04:45.049238 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:45.050040 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f-shm.mount: Deactivated successfully. Feb 13 16:04:45.051464 kubelet[2822]: E0213 16:04:45.049264 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:45.054264 containerd[1557]: time="2025-02-13T16:04:45.054191812Z" level=error msg="Failed to destroy network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.055333 containerd[1557]: time="2025-02-13T16:04:45.055240790Z" level=error msg="encountered an error cleaning up failed sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.055333 containerd[1557]: time="2025-02-13T16:04:45.055278538Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.057153 kubelet[2822]: E0213 16:04:45.055403 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.057153 kubelet[2822]: E0213 16:04:45.055436 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:45.057153 kubelet[2822]: E0213 16:04:45.055450 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:45.057235 kubelet[2822]: E0213 16:04:45.055475 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:45.058258 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db-shm.mount: Deactivated successfully. Feb 13 16:04:45.065380 containerd[1557]: time="2025-02-13T16:04:45.065351176Z" level=error msg="Failed to destroy network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.065620 containerd[1557]: time="2025-02-13T16:04:45.065601199Z" level=error msg="Failed to destroy network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.066849 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c-shm.mount: Deactivated successfully. 
Feb 13 16:04:45.067741 containerd[1557]: time="2025-02-13T16:04:45.067541306Z" level=error msg="encountered an error cleaning up failed sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.068053 containerd[1557]: time="2025-02-13T16:04:45.067941811Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.069609 containerd[1557]: time="2025-02-13T16:04:45.068227817Z" level=error msg="encountered an error cleaning up failed sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.069609 containerd[1557]: time="2025-02-13T16:04:45.068259564Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.070290 kubelet[2822]: E0213 16:04:45.068979 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.070290 kubelet[2822]: E0213 16:04:45.069020 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:45.070290 kubelet[2822]: E0213 16:04:45.069043 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:45.070628 kubelet[2822]: E0213 16:04:45.069076 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:45.070628 kubelet[2822]: E0213 16:04:45.069983 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.070628 kubelet[2822]: E0213 16:04:45.070013 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:45.070719 kubelet[2822]: E0213 16:04:45.070028 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:45.070719 kubelet[2822]: E0213 16:04:45.070047 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:45.071227 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44-shm.mount: Deactivated successfully. 
Feb 13 16:04:45.088043 containerd[1557]: time="2025-02-13T16:04:45.088008314Z" level=error msg="Failed to destroy network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.088529 containerd[1557]: time="2025-02-13T16:04:45.088445052Z" level=error msg="encountered an error cleaning up failed sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.088696 containerd[1557]: time="2025-02-13T16:04:45.088680362Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.089150 kubelet[2822]: E0213 16:04:45.089092 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.089481 kubelet[2822]: E0213 16:04:45.089291 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:45.089481 kubelet[2822]: E0213 16:04:45.089310 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:45.089481 kubelet[2822]: E0213 16:04:45.089340 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:45.095145 containerd[1557]: time="2025-02-13T16:04:45.095071935Z" level=error msg="Failed to destroy network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.095940 containerd[1557]: time="2025-02-13T16:04:45.095883491Z" level=error msg="encountered an error cleaning up failed sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.096438 containerd[1557]: time="2025-02-13T16:04:45.095923538Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.096877 kubelet[2822]: E0213 16:04:45.096666 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:45.096877 kubelet[2822]: E0213 16:04:45.096701 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:45.096877 kubelet[2822]: E0213 16:04:45.096717 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:45.096954 kubelet[2822]: E0213 16:04:45.096744 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:45.919351 kubelet[2822]: I0213 16:04:45.919335 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f" Feb 13 16:04:45.937464 kubelet[2822]: I0213 16:04:45.925370 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.919675397Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.920049219Z" level=info msg="Ensure that sandbox 3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f in task-service has been cleanup successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.920473199Z" level=info msg="TearDown network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.920482634Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" returns successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921143518Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921183752Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921190812Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921724682Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921762672Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.921768329Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922160851Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922193413Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922200147Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922678734Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922715997Z" level=info msg="TearDown network for sandbox 
\"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922721748Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.922996643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:5,}" Feb 13 16:04:45.937520 containerd[1557]: time="2025-02-13T16:04:45.925635328Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.938791103Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.939893606Z" level=info msg="Ensure that sandbox 5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f in task-service has been cleanup successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.940012100Z" level=info msg="TearDown network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.940020089Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.940552618Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.940592525Z" level=info msg="TearDown network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.940598534Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941000095Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941034790Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941040449Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941174075Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941707876Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.941716539Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.942880575Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:45.943463 
containerd[1557]: time="2025-02-13T16:04:45.942917925Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.942923916Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.943039026Z" level=info msg="Ensure that sandbox 56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb in task-service has been cleanup successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.943132642Z" level=info msg="TearDown network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.943140153Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" returns successfully" Feb 13 16:04:45.943463 containerd[1557]: time="2025-02-13T16:04:45.943304264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:5,}" Feb 13 16:04:45.951385 kubelet[2822]: I0213 16:04:45.938135 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb" Feb 13 16:04:45.951385 kubelet[2822]: I0213 16:04:45.944181 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.943942734Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.943984502Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.943990798Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.944431526Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.944470928Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.944489655Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.951190613Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.951237476Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:45.959707 containerd[1557]: time="2025-02-13T16:04:45.951244895Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:45.959707 containerd[1557]: 
time="2025-02-13T16:04:45.952070104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:4,}" Feb 13 16:04:45.959891 kubelet[2822]: I0213 16:04:45.957698 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db" Feb 13 16:04:45.985190 kubelet[2822]: I0213 16:04:45.985070 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c" Feb 13 16:04:45.995088 systemd[1]: run-netns-cni\x2d5f9ec9f5\x2dd90f\x2d249e\x2d2815\x2daf6fae440706.mount: Deactivated successfully. Feb 13 16:04:45.995155 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb-shm.mount: Deactivated successfully. Feb 13 16:04:45.995197 systemd[1]: run-netns-cni\x2deebc67b8\x2dc206\x2d5565\x2d6172\x2d6bb4824402e7.mount: Deactivated successfully. Feb 13 16:04:45.995234 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f-shm.mount: Deactivated successfully. Feb 13 16:04:45.995271 systemd[1]: run-netns-cni\x2d06beff48\x2d20e0\x2d878d\x2d0ec8\x2d02d8e0769881.mount: Deactivated successfully. Feb 13 16:04:46.098319 containerd[1557]: time="2025-02-13T16:04:46.098197820Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:04:46.100539 containerd[1557]: time="2025-02-13T16:04:46.099145237Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:04:46.101527 containerd[1557]: time="2025-02-13T16:04:46.101433684Z" level=info msg="Ensure that sandbox e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44 in task-service has been cleanup successfully" Feb 13 16:04:46.102460 containerd[1557]: time="2025-02-13T16:04:46.101662813Z" level=info msg="TearDown network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" successfully" Feb 13 16:04:46.102460 containerd[1557]: time="2025-02-13T16:04:46.101672932Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" returns successfully" Feb 13 16:04:46.102460 containerd[1557]: time="2025-02-13T16:04:46.102051536Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:04:46.102460 containerd[1557]: time="2025-02-13T16:04:46.102092280Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:04:46.102460 containerd[1557]: time="2025-02-13T16:04:46.102098328Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:04:46.103125 containerd[1557]: time="2025-02-13T16:04:46.102776749Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:04:46.103971 systemd[1]: run-netns-cni\x2d02f579a8\x2de586\x2ded28\x2de619\x2dd07060bf1c57.mount: Deactivated successfully. Feb 13 16:04:46.106272 systemd[1]: run-netns-cni\x2da390705f\x2dd784\x2d36e0\x2dc16c\x2d3aa800c48868.mount: Deactivated successfully. 
Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.105050509Z" level=info msg="Ensure that sandbox fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c in task-service has been cleanup successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.105616317Z" level=info msg="TearDown network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.105626308Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107472915Z" level=info msg="Ensure that sandbox abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db in task-service has been cleanup successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107822337Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107875287Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107881726Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107915222Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107945141Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.107950363Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108100200Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108150823Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108158350Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108185201Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108229605Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108235398Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108427586Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108478570Z" level=info msg="TearDown network for 
sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108485978Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.108547102Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:4,}" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109229730Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109492354Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109500712Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109818990Z" level=info msg="TearDown network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109827059Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.109964490Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110003469Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110010428Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110056069Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:5,}" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110257113Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110294874Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110300510Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110415053Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110451753Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110458286Z" level=info msg="StopPodSandbox for 
\"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110587856Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:46.121162 containerd[1557]: time="2025-02-13T16:04:46.110621384Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:46.111668 systemd[1]: run-netns-cni\x2d3118a1d9\x2d7026\x2d54fa\x2db349\x2d8d8a21c8e7cf.mount: Deactivated successfully. Feb 13 16:04:46.128715 containerd[1557]: time="2025-02-13T16:04:46.110627326Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:46.128715 containerd[1557]: time="2025-02-13T16:04:46.110828450Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:5,}" Feb 13 16:04:46.260667 containerd[1557]: time="2025-02-13T16:04:46.260040712Z" level=error msg="Failed to destroy network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.260667 containerd[1557]: time="2025-02-13T16:04:46.260428515Z" level=error msg="encountered an error cleaning up failed sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.260667 containerd[1557]: time="2025-02-13T16:04:46.260603695Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.262462 kubelet[2822]: E0213 16:04:46.261857 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.262462 kubelet[2822]: E0213 16:04:46.261895 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:46.262462 kubelet[2822]: E0213 16:04:46.261909 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:46.262592 kubelet[2822]: E0213 16:04:46.261940 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:46.270142 containerd[1557]: time="2025-02-13T16:04:46.270076995Z" level=error msg="Failed to destroy network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.274278 containerd[1557]: time="2025-02-13T16:04:46.274088830Z" level=error msg="encountered an error cleaning up failed sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.274278 containerd[1557]: time="2025-02-13T16:04:46.274127636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.274663 kubelet[2822]: E0213 16:04:46.274462 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.274663 kubelet[2822]: E0213 16:04:46.274514 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:46.274663 kubelet[2822]: E0213 16:04:46.274532 2822 kuberuntime_manager.go:1237] 
"CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:46.274746 kubelet[2822]: E0213 16:04:46.274562 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:46.290225 containerd[1557]: time="2025-02-13T16:04:46.290185669Z" level=error msg="Failed to destroy network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.290570 containerd[1557]: time="2025-02-13T16:04:46.290553333Z" level=error msg="encountered an error cleaning up failed sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.290660 containerd[1557]: time="2025-02-13T16:04:46.290600583Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.290804 kubelet[2822]: E0213 16:04:46.290776 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.290843 kubelet[2822]: E0213 16:04:46.290818 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:46.290843 kubelet[2822]: E0213 16:04:46.290832 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:46.290948 kubelet[2822]: E0213 16:04:46.290867 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:46.306733 containerd[1557]: time="2025-02-13T16:04:46.306698676Z" level=error msg="Failed to destroy network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.307136 containerd[1557]: time="2025-02-13T16:04:46.307045289Z" level=error msg="encountered an error cleaning up failed sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.307136 containerd[1557]: time="2025-02-13T16:04:46.307083259Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.320582 kubelet[2822]: E0213 16:04:46.307280 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.320582 kubelet[2822]: E0213 16:04:46.319555 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:46.320582 kubelet[2822]: E0213 16:04:46.319576 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:46.320731 kubelet[2822]: E0213 16:04:46.319604 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:46.322024 containerd[1557]: time="2025-02-13T16:04:46.322005043Z" level=error msg="Failed to destroy network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.322291 containerd[1557]: time="2025-02-13T16:04:46.322277497Z" level=error msg="encountered an error cleaning up failed sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.322503 containerd[1557]: time="2025-02-13T16:04:46.322377623Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.322654 kubelet[2822]: E0213 16:04:46.322556 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.322654 kubelet[2822]: E0213 16:04:46.322583 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:46.322654 kubelet[2822]: E0213 16:04:46.322594 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:46.322732 kubelet[2822]: E0213 16:04:46.322621 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:46.330981 containerd[1557]: time="2025-02-13T16:04:46.330954195Z" level=error msg="Failed to destroy network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.331414 containerd[1557]: time="2025-02-13T16:04:46.331395431Z" level=error msg="encountered an error cleaning up failed sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.331444 containerd[1557]: time="2025-02-13T16:04:46.331433201Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.331652 kubelet[2822]: E0213 16:04:46.331559 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:46.331652 kubelet[2822]: E0213 16:04:46.331592 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:46.331652 kubelet[2822]: E0213 16:04:46.331607 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:46.332251 kubelet[2822]: E0213 16:04:46.331631 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:46.965925 containerd[1557]: time="2025-02-13T16:04:46.962148172Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:46.976792 containerd[1557]: time="2025-02-13T16:04:46.976761986Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 16:04:46.988943 kubelet[2822]: I0213 16:04:46.988551 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b" Feb 13 16:04:46.989179 containerd[1557]: time="2025-02-13T16:04:46.988926730Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.991816632Z" level=info msg="Ensure that sandbox 32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b in task-service has been cleanup successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.992114310Z" level=info msg="TearDown network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.992232114Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.992427805Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.992474196Z" level=info msg="TearDown network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" successfully" Feb 13 16:04:47.003371 containerd[1557]: 
time="2025-02-13T16:04:46.992481892Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.992850849Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.993768740Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.993816692Z" level=info msg="TearDown network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.993826296Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995028798Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995078344Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995085714Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995419285Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995462098Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995468770Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995777958Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995815979Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.995822983Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.997418914Z" level=info msg="Ensure that sandbox b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777 in task-service has been cleanup successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.997752159Z" level=info msg="TearDown network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.997761459Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:46.998933285Z" level=info msg="ImageCreate event 
name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000664393Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:6,}" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000696814Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000739457Z" level=info msg="TearDown network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000746457Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000868757Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000918698Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.000925778Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.001115234Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.001163002Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.001169967Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.002015673Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.002137360Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.002144653Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.002312061Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:47.003371 containerd[1557]: time="2025-02-13T16:04:47.002479025Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:46.996609 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33-shm.mount: Deactivated successfully. 
Feb 13 16:04:47.013830 kubelet[2822]: I0213 16:04:46.992459 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777" Feb 13 16:04:47.013830 kubelet[2822]: I0213 16:04:47.002173 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33" Feb 13 16:04:47.013830 kubelet[2822]: I0213 16:04:47.006766 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0" Feb 13 16:04:47.013830 kubelet[2822]: I0213 16:04:47.008162 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.002486157Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.002832684Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:6,}" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.003626900Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.003852865Z" level=info msg="Ensure that sandbox a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33 in task-service has been cleanup successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.004481698Z" level=info msg="TearDown network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.004491409Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.005476004Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.005524614Z" level=info msg="TearDown network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.005539520Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006064581Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006099994Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006105834Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006226278Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:04:47.013909 
containerd[1557]: time="2025-02-13T16:04:47.006265087Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006271434Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006397342Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006432877Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006438663Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.006707120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:5,}" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.007033707Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.008531639Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.008629096Z" level=info msg="Ensure that sandbox 6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b in task-service has been cleanup successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.008765636Z" level=info msg="TearDown network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.008773740Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.009341429Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.009380226Z" level=info msg="TearDown network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.009388329Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.009529267Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.010190877Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.010199237Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.011757181Z" level=info msg="Ensure that sandbox 
aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0 in task-service has been cleanup successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012569201Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012617135Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012623831Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012712404Z" level=info msg="TearDown network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012720362Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" returns successfully" Feb 13 16:04:47.013909 containerd[1557]: time="2025-02-13T16:04:47.012852832Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:46.996688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount749748970.mount: Deactivated successfully. Feb 13 16:04:47.014538 containerd[1557]: time="2025-02-13T16:04:47.012998970Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:04:47.014538 containerd[1557]: time="2025-02-13T16:04:47.013274189Z" level=info msg="TearDown network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" successfully" Feb 13 16:04:47.014538 containerd[1557]: time="2025-02-13T16:04:47.013281868Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" returns successfully" Feb 13 16:04:46.999429 systemd[1]: run-netns-cni\x2d36828ffd\x2d49dc\x2d36a9\x2de96c\x2daa06a685332a.mount: Deactivated successfully. Feb 13 16:04:47.005191 systemd[1]: run-netns-cni\x2d4e4d5e30\x2df2f6\x2d367e\x2d301d\x2d08678951771f.mount: Deactivated successfully. Feb 13 16:04:47.010720 systemd[1]: run-netns-cni\x2d33a71eaf\x2d45e3\x2d587d\x2d3add\x2db48f4e5ed10c.mount: Deactivated successfully. 
Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.014610237Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.014754361Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.014763148Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015111829Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015149453Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015385405Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015417646Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015673154Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:04:47.016224 containerd[1557]: time="2025-02-13T16:04:47.015711563Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:04:47.013533 systemd[1]: run-netns-cni\x2d9a11fb6c\x2d032b\x2dd167\x2daf87\x2dbc94e748f15d.mount: Deactivated successfully. 
Feb 13 16:04:47.016789 containerd[1557]: time="2025-02-13T16:04:47.016718268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:6,}" Feb 13 16:04:47.016911 containerd[1557]: time="2025-02-13T16:04:47.016901440Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:04:47.016996 containerd[1557]: time="2025-02-13T16:04:47.016987915Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:04:47.017206 containerd[1557]: time="2025-02-13T16:04:47.017025818Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:04:47.017300 containerd[1557]: time="2025-02-13T16:04:47.017290865Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:47.017384 containerd[1557]: time="2025-02-13T16:04:47.017370385Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:47.017499 containerd[1557]: time="2025-02-13T16:04:47.017412855Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:47.017793 kubelet[2822]: I0213 16:04:47.017780 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33" Feb 13 16:04:47.018464 containerd[1557]: time="2025-02-13T16:04:47.018367792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:5,}" Feb 13 16:04:47.018555 containerd[1557]: time="2025-02-13T16:04:47.018545015Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" Feb 13 16:04:47.018813 containerd[1557]: time="2025-02-13T16:04:47.018802512Z" level=info msg="Ensure that sandbox fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33 in task-service has been cleanup successfully" Feb 13 16:04:47.019010 containerd[1557]: time="2025-02-13T16:04:47.018999811Z" level=info msg="TearDown network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" successfully" Feb 13 16:04:47.019064 containerd[1557]: time="2025-02-13T16:04:47.019056576Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" returns successfully" Feb 13 16:04:47.019846 containerd[1557]: time="2025-02-13T16:04:47.019675156Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 5.206905678s" Feb 13 16:04:47.019846 containerd[1557]: time="2025-02-13T16:04:47.019693922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 16:04:47.020370 containerd[1557]: time="2025-02-13T16:04:47.020231604Z" level=info msg="StopPodSandbox for 
\"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:04:47.020525 containerd[1557]: time="2025-02-13T16:04:47.020473925Z" level=info msg="TearDown network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" successfully" Feb 13 16:04:47.020525 containerd[1557]: time="2025-02-13T16:04:47.020509281Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" returns successfully" Feb 13 16:04:47.021294 containerd[1557]: time="2025-02-13T16:04:47.021255140Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:04:47.021488 containerd[1557]: time="2025-02-13T16:04:47.021398672Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:04:47.021488 containerd[1557]: time="2025-02-13T16:04:47.021407511Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:04:47.022105 containerd[1557]: time="2025-02-13T16:04:47.021591914Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:04:47.022185 containerd[1557]: time="2025-02-13T16:04:47.022165054Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:04:47.022229 containerd[1557]: time="2025-02-13T16:04:47.022220348Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:04:47.022390 containerd[1557]: time="2025-02-13T16:04:47.022381122Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:47.022543 containerd[1557]: time="2025-02-13T16:04:47.022459760Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:47.022543 containerd[1557]: time="2025-02-13T16:04:47.022467656Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:04:47.022738 containerd[1557]: time="2025-02-13T16:04:47.022681022Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:47.022941 containerd[1557]: time="2025-02-13T16:04:47.022715618Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:47.022941 containerd[1557]: time="2025-02-13T16:04:47.022810555Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:47.023240 containerd[1557]: time="2025-02-13T16:04:47.023119335Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:6,}" Feb 13 16:04:47.217435 containerd[1557]: time="2025-02-13T16:04:47.216905727Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 16:04:47.236897 containerd[1557]: time="2025-02-13T16:04:47.236822466Z" level=error msg="Failed to destroy network for sandbox 
\"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.237418 containerd[1557]: time="2025-02-13T16:04:47.237021179Z" level=error msg="Failed to destroy network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.237418 containerd[1557]: time="2025-02-13T16:04:47.237205156Z" level=error msg="encountered an error cleaning up failed sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.237418 containerd[1557]: time="2025-02-13T16:04:47.237237900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.237418 containerd[1557]: time="2025-02-13T16:04:47.237342570Z" level=error msg="encountered an error cleaning up failed sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.237418 containerd[1557]: time="2025-02-13T16:04:47.237364173Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.238251 kubelet[2822]: E0213 16:04:47.237613 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.238251 kubelet[2822]: E0213 16:04:47.237665 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" 
Feb 13 16:04:47.238251 kubelet[2822]: E0213 16:04:47.237680 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tndq9" Feb 13 16:04:47.238348 kubelet[2822]: E0213 16:04:47.237703 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tndq9_calico-system(4e58e6b2-6097-4f35-ba27-10ac9fc1ce49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tndq9" podUID="4e58e6b2-6097-4f35-ba27-10ac9fc1ce49" Feb 13 16:04:47.238348 kubelet[2822]: E0213 16:04:47.237727 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.238348 kubelet[2822]: E0213 16:04:47.237739 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:47.238430 kubelet[2822]: E0213 16:04:47.237746 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-x9p69" Feb 13 16:04:47.238430 kubelet[2822]: E0213 16:04:47.237767 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-x9p69_kube-system(4ece0321-0375-43da-8c1b-713662433b6a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-x9p69" podUID="4ece0321-0375-43da-8c1b-713662433b6a" Feb 13 16:04:47.245671 containerd[1557]: 
time="2025-02-13T16:04:47.245619580Z" level=error msg="Failed to destroy network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.245909 containerd[1557]: time="2025-02-13T16:04:47.245891727Z" level=error msg="encountered an error cleaning up failed sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.245948 containerd[1557]: time="2025-02-13T16:04:47.245925970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.246242 containerd[1557]: time="2025-02-13T16:04:47.246226108Z" level=error msg="Failed to destroy network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.246343 kubelet[2822]: E0213 16:04:47.246311 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.246422 kubelet[2822]: E0213 16:04:47.246413 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:47.246479 kubelet[2822]: E0213 16:04:47.246470 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" Feb 13 16:04:47.246922 containerd[1557]: time="2025-02-13T16:04:47.246869096Z" level=error msg="encountered an error cleaning up failed sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.246957 kubelet[2822]: E0213 16:04:47.246897 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6cd7f7ffd8-hvv7z_calico-system(5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podUID="5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5" Feb 13 16:04:47.247511 containerd[1557]: time="2025-02-13T16:04:47.247069074Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.247692 kubelet[2822]: E0213 16:04:47.247596 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.247692 kubelet[2822]: E0213 16:04:47.247617 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:47.247692 kubelet[2822]: E0213 16:04:47.247628 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" Feb 13 16:04:47.247784 kubelet[2822]: E0213 16:04:47.247672 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tldgb_calico-apiserver(3d909c6e-816b-4c0f-b575-428d389c17b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\\\": plugin type=\\\"calico\\\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podUID="3d909c6e-816b-4c0f-b575-428d389c17b0" Feb 13 16:04:47.274934 containerd[1557]: time="2025-02-13T16:04:47.274788642Z" level=info msg="CreateContainer within sandbox \"373d6ebdd86734fbb740fed62a0687ac6604392a63922262a746e3efe076d3ac\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47\"" Feb 13 16:04:47.274934 containerd[1557]: time="2025-02-13T16:04:47.274883106Z" level=error msg="Failed to destroy network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.275273 containerd[1557]: time="2025-02-13T16:04:47.275138361Z" level=error msg="encountered an error cleaning up failed sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.275273 containerd[1557]: time="2025-02-13T16:04:47.275168909Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.275273 containerd[1557]: time="2025-02-13T16:04:47.275235442Z" level=error msg="Failed to destroy network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.275593 kubelet[2822]: E0213 16:04:47.275447 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.275593 kubelet[2822]: E0213 16:04:47.275483 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:47.275593 kubelet[2822]: E0213 16:04:47.275500 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" Feb 13 16:04:47.275690 kubelet[2822]: E0213 16:04:47.275525 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-85f996756d-tfwqr_calico-apiserver(f8381855-f1fc-4a49-b9b0-17a3731a7463)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podUID="f8381855-f1fc-4a49-b9b0-17a3731a7463" Feb 13 16:04:47.276026 containerd[1557]: time="2025-02-13T16:04:47.275957480Z" level=error msg="encountered an error cleaning up failed sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.276026 containerd[1557]: time="2025-02-13T16:04:47.275982757Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.276117 kubelet[2822]: E0213 16:04:47.276099 2822 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 16:04:47.276145 kubelet[2822]: E0213 16:04:47.276128 2822 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" Feb 13 16:04:47.276168 kubelet[2822]: E0213 16:04:47.276147 2822 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-znr52" 
Feb 13 16:04:47.276199 kubelet[2822]: E0213 16:04:47.276181 2822 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-znr52_kube-system(7dfb39fc-a74d-477a-9521-d86660f1b99b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-znr52" podUID="7dfb39fc-a74d-477a-9521-d86660f1b99b" Feb 13 16:04:47.279203 containerd[1557]: time="2025-02-13T16:04:47.279186405Z" level=info msg="StartContainer for \"50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47\"" Feb 13 16:04:47.337733 systemd[1]: Started cri-containerd-50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47.scope - libcontainer container 50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47. Feb 13 16:04:47.361847 containerd[1557]: time="2025-02-13T16:04:47.361803089Z" level=info msg="StartContainer for \"50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47\" returns successfully" Feb 13 16:04:47.500988 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 16:04:47.502509 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Feb 13 16:04:47.998579 systemd[1]: run-netns-cni\x2d4ad0d77c\x2d1bfd\x2d0752\x2d76ab\x2d136218f9d8e4.mount: Deactivated successfully. Feb 13 16:04:48.034439 kubelet[2822]: I0213 16:04:48.034424 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0" Feb 13 16:04:48.041969 containerd[1557]: time="2025-02-13T16:04:48.040853720Z" level=info msg="StopPodSandbox for \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\"" Feb 13 16:04:48.041969 containerd[1557]: time="2025-02-13T16:04:48.041017772Z" level=info msg="Ensure that sandbox 4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0 in task-service has been cleanup successfully" Feb 13 16:04:48.041969 containerd[1557]: time="2025-02-13T16:04:48.041790672Z" level=info msg="TearDown network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" successfully" Feb 13 16:04:48.041969 containerd[1557]: time="2025-02-13T16:04:48.041803052Z" level=info msg="StopPodSandbox for \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" returns successfully" Feb 13 16:04:48.042281 containerd[1557]: time="2025-02-13T16:04:48.042000060Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" Feb 13 16:04:48.042281 containerd[1557]: time="2025-02-13T16:04:48.042053139Z" level=info msg="TearDown network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" successfully" Feb 13 16:04:48.042281 containerd[1557]: time="2025-02-13T16:04:48.042061046Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" returns successfully" Feb 13 16:04:48.043451 containerd[1557]: time="2025-02-13T16:04:48.043300350Z" level=info msg="StopPodSandbox for 
\"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:04:48.043451 containerd[1557]: time="2025-02-13T16:04:48.043359386Z" level=info msg="TearDown network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" successfully" Feb 13 16:04:48.043451 containerd[1557]: time="2025-02-13T16:04:48.043368174Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" returns successfully" Feb 13 16:04:48.044596 systemd[1]: run-netns-cni\x2d0640e0af\x2d264b\x2d02e5\x2dcea7\x2d07d63a076c74.mount: Deactivated successfully. Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.044893480Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.044950488Z" level=info msg="TearDown network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.044959222Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.045213112Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.045260389Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.045268685Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.045390578Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.045448597Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:04:48.046113 containerd[1557]: time="2025-02-13T16:04:48.046020497Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:04:48.046527 containerd[1557]: time="2025-02-13T16:04:48.046426721Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:04:48.046527 containerd[1557]: time="2025-02-13T16:04:48.046490096Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:04:48.046527 containerd[1557]: time="2025-02-13T16:04:48.046499310Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:04:48.046848 kubelet[2822]: I0213 16:04:48.046830 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444" Feb 13 16:04:48.047932 containerd[1557]: time="2025-02-13T16:04:48.047414236Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:7,}" Feb 13 16:04:48.049296 containerd[1557]: time="2025-02-13T16:04:48.049271521Z" 
level=info msg="StopPodSandbox for \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\"" Feb 13 16:04:48.049616 containerd[1557]: time="2025-02-13T16:04:48.049597461Z" level=info msg="Ensure that sandbox d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444 in task-service has been cleanup successfully" Feb 13 16:04:48.051725 systemd[1]: run-netns-cni\x2dc0921894\x2dba28\x2dc244\x2d03cb\x2d5434c03a9d6d.mount: Deactivated successfully. Feb 13 16:04:48.052228 containerd[1557]: time="2025-02-13T16:04:48.051964638Z" level=info msg="TearDown network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" successfully" Feb 13 16:04:48.052228 containerd[1557]: time="2025-02-13T16:04:48.051974865Z" level=info msg="StopPodSandbox for \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" returns successfully" Feb 13 16:04:48.052228 containerd[1557]: time="2025-02-13T16:04:48.052197599Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" Feb 13 16:04:48.052293 containerd[1557]: time="2025-02-13T16:04:48.052270263Z" level=info msg="TearDown network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" successfully" Feb 13 16:04:48.052293 containerd[1557]: time="2025-02-13T16:04:48.052278755Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" returns successfully" Feb 13 16:04:48.052529 containerd[1557]: time="2025-02-13T16:04:48.052510434Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:04:48.053019 containerd[1557]: time="2025-02-13T16:04:48.052557350Z" level=info msg="TearDown network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" successfully" Feb 13 16:04:48.053019 containerd[1557]: time="2025-02-13T16:04:48.052879515Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" returns successfully" Feb 13 16:04:48.054348 kubelet[2822]: I0213 16:04:48.053432 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945" Feb 13 16:04:48.054485 containerd[1557]: time="2025-02-13T16:04:48.054366799Z" level=info msg="StopPodSandbox for \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\"" Feb 13 16:04:48.055373 containerd[1557]: time="2025-02-13T16:04:48.054423234Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:04:48.055373 containerd[1557]: time="2025-02-13T16:04:48.055349281Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:04:48.055758 containerd[1557]: time="2025-02-13T16:04:48.055358137Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:04:48.055974 containerd[1557]: time="2025-02-13T16:04:48.055157102Z" level=info msg="Ensure that sandbox b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945 in task-service has been cleanup successfully" Feb 13 16:04:48.056371 containerd[1557]: time="2025-02-13T16:04:48.056237087Z" level=info msg="TearDown network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" successfully" Feb 13 16:04:48.056371 containerd[1557]: 
time="2025-02-13T16:04:48.056247270Z" level=info msg="StopPodSandbox for \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" returns successfully" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056663181Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056703263Z" level=info msg="TearDown network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" successfully" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056710290Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" returns successfully" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056728121Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056765051Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:04:48.058450 containerd[1557]: time="2025-02-13T16:04:48.056771915Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:04:48.057627 systemd[1]: run-netns-cni\x2d2aa56e1a\x2db177\x2dc387\x2d1f1e\x2d8a55df56e5eb.mount: Deactivated successfully. Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058787956Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058826991Z" level=info msg="TearDown network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" successfully" Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058833307Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" returns successfully" Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058872737Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058904891Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:04:48.058972 containerd[1557]: time="2025-02-13T16:04:48.058910155Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059101516Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059181451Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059213591Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059219017Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059330298Z" level=info 
msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059363626Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:04:48.059542 containerd[1557]: time="2025-02-13T16:04:48.059369231Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:04:48.059737 containerd[1557]: time="2025-02-13T16:04:48.059727525Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:04:48.059783 containerd[1557]: time="2025-02-13T16:04:48.059776127Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:04:48.060387 containerd[1557]: time="2025-02-13T16:04:48.060370651Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:04:48.060615 containerd[1557]: time="2025-02-13T16:04:48.060515874Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:04:48.060615 containerd[1557]: time="2025-02-13T16:04:48.060525382Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:04:48.060615 containerd[1557]: time="2025-02-13T16:04:48.060586770Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:7,}" Feb 13 16:04:48.061318 kubelet[2822]: I0213 16:04:48.060862 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb" Feb 13 16:04:48.063339 containerd[1557]: time="2025-02-13T16:04:48.061609855Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:6,}" Feb 13 16:04:48.063339 containerd[1557]: time="2025-02-13T16:04:48.061711546Z" level=info msg="StopPodSandbox for \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\"" Feb 13 16:04:48.063339 containerd[1557]: time="2025-02-13T16:04:48.061799997Z" level=info msg="Ensure that sandbox 5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb in task-service has been cleanup successfully" Feb 13 16:04:48.063125 systemd[1]: run-netns-cni\x2d3370711e\x2d2c94\x2de5a5\x2d5374\x2d7fd36ec04bae.mount: Deactivated successfully. 
Feb 13 16:04:48.064173 containerd[1557]: time="2025-02-13T16:04:48.064161483Z" level=info msg="TearDown network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" successfully" Feb 13 16:04:48.064260 containerd[1557]: time="2025-02-13T16:04:48.064250873Z" level=info msg="StopPodSandbox for \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" returns successfully" Feb 13 16:04:48.065655 containerd[1557]: time="2025-02-13T16:04:48.065513951Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" Feb 13 16:04:48.065655 containerd[1557]: time="2025-02-13T16:04:48.065565760Z" level=info msg="TearDown network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" successfully" Feb 13 16:04:48.065655 containerd[1557]: time="2025-02-13T16:04:48.065572553Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" returns successfully" Feb 13 16:04:48.075273 containerd[1557]: time="2025-02-13T16:04:48.075248057Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:04:48.075347 containerd[1557]: time="2025-02-13T16:04:48.075303048Z" level=info msg="TearDown network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" successfully" Feb 13 16:04:48.075347 containerd[1557]: time="2025-02-13T16:04:48.075309582Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" returns successfully" Feb 13 16:04:48.077402 containerd[1557]: time="2025-02-13T16:04:48.076864155Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:04:48.077716 containerd[1557]: time="2025-02-13T16:04:48.077705491Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:04:48.077768 containerd[1557]: time="2025-02-13T16:04:48.077759739Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:04:48.078441 containerd[1557]: time="2025-02-13T16:04:48.078422369Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:04:48.078476 containerd[1557]: time="2025-02-13T16:04:48.078462507Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:04:48.078476 containerd[1557]: time="2025-02-13T16:04:48.078469293Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:04:48.080576 containerd[1557]: time="2025-02-13T16:04:48.080561002Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:04:48.081034 containerd[1557]: time="2025-02-13T16:04:48.080611649Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:04:48.081034 containerd[1557]: time="2025-02-13T16:04:48.080621911Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:04:48.082232 containerd[1557]: time="2025-02-13T16:04:48.082220018Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:6,}" Feb 13 16:04:48.084886 kubelet[2822]: I0213 16:04:48.082877 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154" Feb 13 16:04:48.085996 containerd[1557]: time="2025-02-13T16:04:48.085979049Z" level=info msg="StopPodSandbox for \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\"" Feb 13 16:04:48.086311 containerd[1557]: time="2025-02-13T16:04:48.086220751Z" level=info msg="Ensure that sandbox b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154 in task-service has been cleanup successfully" Feb 13 16:04:48.086637 containerd[1557]: time="2025-02-13T16:04:48.086624713Z" level=info msg="TearDown network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" successfully" Feb 13 16:04:48.086777 containerd[1557]: time="2025-02-13T16:04:48.086767404Z" level=info msg="StopPodSandbox for \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" returns successfully" Feb 13 16:04:48.096777 containerd[1557]: time="2025-02-13T16:04:48.096756037Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" Feb 13 16:04:48.097136 containerd[1557]: time="2025-02-13T16:04:48.097124880Z" level=info msg="TearDown network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" successfully" Feb 13 16:04:48.097981 containerd[1557]: time="2025-02-13T16:04:48.097967769Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" returns successfully" Feb 13 16:04:48.099183 containerd[1557]: time="2025-02-13T16:04:48.098880218Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:04:48.099530 containerd[1557]: time="2025-02-13T16:04:48.099216047Z" level=info msg="TearDown network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" successfully" Feb 13 16:04:48.100141 containerd[1557]: time="2025-02-13T16:04:48.099224287Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" returns successfully" Feb 13 16:04:48.100141 containerd[1557]: time="2025-02-13T16:04:48.099798929Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:04:48.101150 containerd[1557]: time="2025-02-13T16:04:48.101138767Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:04:48.101246 containerd[1557]: time="2025-02-13T16:04:48.101236672Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:04:48.101538 kubelet[2822]: I0213 16:04:48.101524 2822 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7" Feb 13 16:04:48.103209 containerd[1557]: time="2025-02-13T16:04:48.103190582Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.103283664Z" level=info msg="StopPodSandbox for \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\"" Feb 13 
16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.103417215Z" level=info msg="Ensure that sandbox 05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7 in task-service has been cleanup successfully" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.103780502Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.103789752Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.104169041Z" level=info msg="TearDown network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" successfully" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.104177874Z" level=info msg="StopPodSandbox for \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" returns successfully" Feb 13 16:04:48.104341 containerd[1557]: time="2025-02-13T16:04:48.104218755Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:04:48.104544 containerd[1557]: time="2025-02-13T16:04:48.104498679Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" Feb 13 16:04:48.105036 containerd[1557]: time="2025-02-13T16:04:48.104613870Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:04:48.105036 containerd[1557]: time="2025-02-13T16:04:48.104624301Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:04:48.105036 containerd[1557]: time="2025-02-13T16:04:48.104710988Z" level=info msg="TearDown network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" successfully" Feb 13 16:04:48.105036 containerd[1557]: time="2025-02-13T16:04:48.104717492Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" returns successfully" Feb 13 16:04:48.105360 containerd[1557]: time="2025-02-13T16:04:48.105169696Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:04:48.105360 containerd[1557]: time="2025-02-13T16:04:48.105249241Z" level=info msg="TearDown network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" successfully" Feb 13 16:04:48.105360 containerd[1557]: time="2025-02-13T16:04:48.105259002Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" returns successfully" Feb 13 16:04:48.105602 containerd[1557]: time="2025-02-13T16:04:48.105578699Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:04:48.105672 containerd[1557]: time="2025-02-13T16:04:48.105660929Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:04:48.105715 containerd[1557]: time="2025-02-13T16:04:48.105702652Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:04:48.105715 containerd[1557]: time="2025-02-13T16:04:48.105711987Z" level=info msg="StopPodSandbox for 
\"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:04:48.105839 containerd[1557]: time="2025-02-13T16:04:48.105828875Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:04:48.106369 containerd[1557]: time="2025-02-13T16:04:48.106357969Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:04:48.108131 containerd[1557]: time="2025-02-13T16:04:48.108119863Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:04:48.108478 containerd[1557]: time="2025-02-13T16:04:48.108467893Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:04:48.108579 containerd[1557]: time="2025-02-13T16:04:48.108570476Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:04:48.108721 containerd[1557]: time="2025-02-13T16:04:48.108621128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:7,}" Feb 13 16:04:48.109279 containerd[1557]: time="2025-02-13T16:04:48.108947918Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:04:48.109377 containerd[1557]: time="2025-02-13T16:04:48.109367798Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:04:48.114280 containerd[1557]: time="2025-02-13T16:04:48.113815498Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:04:48.122267 containerd[1557]: time="2025-02-13T16:04:48.122242319Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:04:48.122343 containerd[1557]: time="2025-02-13T16:04:48.122325234Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:04:48.122343 containerd[1557]: time="2025-02-13T16:04:48.122333001Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:04:48.123810 containerd[1557]: time="2025-02-13T16:04:48.123792913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:7,}" Feb 13 16:04:48.460567 systemd-networkd[1483]: calib9924be8bb8: Link UP Feb 13 16:04:48.461092 systemd-networkd[1483]: calib9924be8bb8: Gained carrier Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.175 [INFO][4896] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.189 [INFO][4896] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--tndq9-eth0 csi-node-driver- calico-system 4e58e6b2-6097-4f35-ba27-10ac9fc1ce49 576 0 2025-02-13 16:04:31 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:84cddb44f k8s-app:csi-node-driver name:csi-node-driver 
pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-tndq9 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib9924be8bb8 [] []}} ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.189 [INFO][4896] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4929] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" HandleID="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Workload="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.431 [INFO][4929] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" HandleID="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Workload="localhost-k8s-csi--node--driver--tndq9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003e14f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-tndq9", "timestamp":"2025-02-13 16:04:48.412841323 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.431 [INFO][4929] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.431 [INFO][4929] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.431 [INFO][4929] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.433 [INFO][4929] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.439 [INFO][4929] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.442 [INFO][4929] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.443 [INFO][4929] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.444 [INFO][4929] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.444 [INFO][4929] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.444 [INFO][4929] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6 Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.446 [INFO][4929] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4929] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4929] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" host="localhost" Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4929] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:48.466939 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4929] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" HandleID="k8s-pod-network.ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Workload="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.451 [INFO][4896] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--tndq9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49", ResourceVersion:"576", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-tndq9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib9924be8bb8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.451 [INFO][4896] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.451 [INFO][4896] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib9924be8bb8 ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.459 [INFO][4896] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.460 [INFO][4896] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--tndq9-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4e58e6b2-6097-4f35-ba27-10ac9fc1ce49", ResourceVersion:"576", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"84cddb44f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6", Pod:"csi-node-driver-tndq9", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib9924be8bb8", MAC:"6e:0f:77:86:56:dc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.469228 containerd[1557]: 2025-02-13 16:04:48.465 [INFO][4896] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6" Namespace="calico-system" Pod="csi-node-driver-tndq9" WorkloadEndpoint="localhost-k8s-csi--node--driver--tndq9-eth0" Feb 13 16:04:48.497450 containerd[1557]: time="2025-02-13T16:04:48.497235176Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:48.497450 containerd[1557]: time="2025-02-13T16:04:48.497275942Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:48.497450 containerd[1557]: time="2025-02-13T16:04:48.497286479Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.497450 containerd[1557]: time="2025-02-13T16:04:48.497342084Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.513773 systemd[1]: Started cri-containerd-ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6.scope - libcontainer container ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6. 
Feb 13 16:04:48.531064 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:48.533349 kubelet[2822]: I0213 16:04:48.521021 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xhs2q" podStartSLOduration=1.9218832639999999 podStartE2EDuration="17.468663492s" podCreationTimestamp="2025-02-13 16:04:31 +0000 UTC" firstStartedPulling="2025-02-13 16:04:31.473536739 +0000 UTC m=+10.892725407" lastFinishedPulling="2025-02-13 16:04:47.020316963 +0000 UTC m=+26.439505635" observedRunningTime="2025-02-13 16:04:48.095078698 +0000 UTC m=+27.514267371" watchObservedRunningTime="2025-02-13 16:04:48.468663492 +0000 UTC m=+27.887852165" Feb 13 16:04:48.545383 containerd[1557]: time="2025-02-13T16:04:48.545337656Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tndq9,Uid:4e58e6b2-6097-4f35-ba27-10ac9fc1ce49,Namespace:calico-system,Attempt:7,} returns sandbox id \"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6\"" Feb 13 16:04:48.550607 containerd[1557]: time="2025-02-13T16:04:48.549855234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 16:04:48.563241 systemd-networkd[1483]: cali168497c2b6c: Link UP Feb 13 16:04:48.563520 systemd-networkd[1483]: cali168497c2b6c: Gained carrier Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.192 [INFO][4912] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.198 [INFO][4912] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0 calico-apiserver-85f996756d- calico-apiserver f8381855-f1fc-4a49-b9b0-17a3731a7463 659 0 2025-02-13 16:04:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85f996756d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-85f996756d-tfwqr eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali168497c2b6c [] []}} ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.199 [INFO][4912] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4931] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" HandleID="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Workload="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4931] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" 
HandleID="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Workload="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004a08f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-85f996756d-tfwqr", "timestamp":"2025-02-13 16:04:48.412780623 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4931] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4931] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.449 [INFO][4931] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.534 [INFO][4931] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.539 [INFO][4931] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.546 [INFO][4931] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.548 [INFO][4931] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.551 [INFO][4931] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.551 [INFO][4931] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.553 [INFO][4931] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886 Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.556 [INFO][4931] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4931] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4931] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" host="localhost" Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4931] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:48.572829 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4931] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" HandleID="k8s-pod-network.749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Workload="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.561 [INFO][4912] cni-plugin/k8s.go 386: Populated endpoint ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0", GenerateName:"calico-apiserver-85f996756d-", Namespace:"calico-apiserver", SelfLink:"", UID:"f8381855-f1fc-4a49-b9b0-17a3731a7463", ResourceVersion:"659", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85f996756d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-85f996756d-tfwqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali168497c2b6c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.562 [INFO][4912] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.562 [INFO][4912] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali168497c2b6c ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.563 [INFO][4912] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.563 [INFO][4912] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0", GenerateName:"calico-apiserver-85f996756d-", Namespace:"calico-apiserver", SelfLink:"", UID:"f8381855-f1fc-4a49-b9b0-17a3731a7463", ResourceVersion:"659", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85f996756d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886", Pod:"calico-apiserver-85f996756d-tfwqr", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali168497c2b6c", MAC:"92:02:63:15:25:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.573808 containerd[1557]: 2025-02-13 16:04:48.570 [INFO][4912] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tfwqr" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tfwqr-eth0" Feb 13 16:04:48.590070 containerd[1557]: time="2025-02-13T16:04:48.589883362Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:48.590070 containerd[1557]: time="2025-02-13T16:04:48.589931036Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:48.590070 containerd[1557]: time="2025-02-13T16:04:48.589946781Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.590070 containerd[1557]: time="2025-02-13T16:04:48.590023049Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.600729 systemd[1]: Started cri-containerd-749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886.scope - libcontainer container 749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886. 
Feb 13 16:04:48.609857 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:48.631486 containerd[1557]: time="2025-02-13T16:04:48.631467241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tfwqr,Uid:f8381855-f1fc-4a49-b9b0-17a3731a7463,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886\"" Feb 13 16:04:48.654750 systemd-networkd[1483]: cali395e6fe55d3: Link UP Feb 13 16:04:48.656026 systemd-networkd[1483]: cali395e6fe55d3: Gained carrier Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.102 [INFO][4853] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.166 [INFO][4853] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0 calico-apiserver-85f996756d- calico-apiserver 3d909c6e-816b-4c0f-b575-428d389c17b0 662 0 2025-02-13 16:04:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:85f996756d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-85f996756d-tldgb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali395e6fe55d3 [] []}} ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.166 [INFO][4853] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4926] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" HandleID="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Workload="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4926] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" HandleID="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Workload="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002549e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-85f996756d-tldgb", "timestamp":"2025-02-13 16:04:48.412904314 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4926] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4926] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.559 [INFO][4926] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.634 [INFO][4926] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.639 [INFO][4926] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.644 [INFO][4926] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.645 [INFO][4926] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.646 [INFO][4926] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.646 [INFO][4926] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.647 [INFO][4926] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451 Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.649 [INFO][4926] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4926] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4926] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" host="localhost" Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4926] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:48.663678 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4926] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" HandleID="k8s-pod-network.faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Workload="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.653 [INFO][4853] cni-plugin/k8s.go 386: Populated endpoint ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0", GenerateName:"calico-apiserver-85f996756d-", Namespace:"calico-apiserver", SelfLink:"", UID:"3d909c6e-816b-4c0f-b575-428d389c17b0", ResourceVersion:"662", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85f996756d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-85f996756d-tldgb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali395e6fe55d3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.653 [INFO][4853] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.653 [INFO][4853] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali395e6fe55d3 ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.655 [INFO][4853] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.655 [INFO][4853] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0", GenerateName:"calico-apiserver-85f996756d-", Namespace:"calico-apiserver", SelfLink:"", UID:"3d909c6e-816b-4c0f-b575-428d389c17b0", ResourceVersion:"662", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"85f996756d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451", Pod:"calico-apiserver-85f996756d-tldgb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali395e6fe55d3", MAC:"a2:9c:40:a0:d2:7e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.664171 containerd[1557]: 2025-02-13 16:04:48.662 [INFO][4853] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451" Namespace="calico-apiserver" Pod="calico-apiserver-85f996756d-tldgb" WorkloadEndpoint="localhost-k8s-calico--apiserver--85f996756d--tldgb-eth0" Feb 13 16:04:48.677902 containerd[1557]: time="2025-02-13T16:04:48.677839572Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:48.678066 containerd[1557]: time="2025-02-13T16:04:48.677879816Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:48.678066 containerd[1557]: time="2025-02-13T16:04:48.677900503Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.678066 containerd[1557]: time="2025-02-13T16:04:48.677967314Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.692759 systemd[1]: Started cri-containerd-faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451.scope - libcontainer container faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451. 
Feb 13 16:04:48.702172 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:48.720556 containerd[1557]: time="2025-02-13T16:04:48.720465277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-85f996756d-tldgb,Uid:3d909c6e-816b-4c0f-b575-428d389c17b0,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451\"" Feb 13 16:04:48.761817 systemd-networkd[1483]: cali869023cc7df: Link UP Feb 13 16:04:48.762921 systemd-networkd[1483]: cali869023cc7df: Gained carrier Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.129 [INFO][4864] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.165 [INFO][4864] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--x9p69-eth0 coredns-668d6bf9bc- kube-system 4ece0321-0375-43da-8c1b-713662433b6a 664 0 2025-02-13 16:04:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-x9p69 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali869023cc7df [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.165 [INFO][4864] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4925] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" HandleID="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Workload="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4925] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" HandleID="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Workload="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00023bba0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-x9p69", "timestamp":"2025-02-13 16:04:48.412888797 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4925] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4925] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.651 [INFO][4925] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.736 [INFO][4925] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.741 [INFO][4925] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.745 [INFO][4925] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.747 [INFO][4925] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.748 [INFO][4925] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.748 [INFO][4925] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.750 [INFO][4925] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.754 [INFO][4925] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4925] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4925] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" host="localhost" Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4925] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:48.773826 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4925] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" HandleID="k8s-pod-network.f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Workload="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.759 [INFO][4864] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--x9p69-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4ece0321-0375-43da-8c1b-713662433b6a", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-x9p69", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali869023cc7df", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.759 [INFO][4864] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.759 [INFO][4864] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali869023cc7df ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.763 [INFO][4864] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.763 
[INFO][4864] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--x9p69-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4ece0321-0375-43da-8c1b-713662433b6a", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd", Pod:"coredns-668d6bf9bc-x9p69", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali869023cc7df", MAC:"d2:a0:43:58:74:0d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.774974 containerd[1557]: 2025-02-13 16:04:48.772 [INFO][4864] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd" Namespace="kube-system" Pod="coredns-668d6bf9bc-x9p69" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--x9p69-eth0" Feb 13 16:04:48.794800 containerd[1557]: time="2025-02-13T16:04:48.794756557Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:48.794897 containerd[1557]: time="2025-02-13T16:04:48.794790375Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:48.794897 containerd[1557]: time="2025-02-13T16:04:48.794801379Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.794897 containerd[1557]: time="2025-02-13T16:04:48.794848137Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.820916 systemd[1]: Started cri-containerd-f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd.scope - libcontainer container f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd. 
Feb 13 16:04:48.830043 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:48.861770 containerd[1557]: time="2025-02-13T16:04:48.861747751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-x9p69,Uid:4ece0321-0375-43da-8c1b-713662433b6a,Namespace:kube-system,Attempt:6,} returns sandbox id \"f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd\"" Feb 13 16:04:48.865559 containerd[1557]: time="2025-02-13T16:04:48.865543561Z" level=info msg="CreateContainer within sandbox \"f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 16:04:48.881981 systemd-networkd[1483]: calia99eb54da63: Link UP Feb 13 16:04:48.882542 systemd-networkd[1483]: calia99eb54da63: Gained carrier Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.145 [INFO][4883] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.167 [INFO][4883] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--668d6bf9bc--znr52-eth0 coredns-668d6bf9bc- kube-system 7dfb39fc-a74d-477a-9521-d86660f1b99b 655 0 2025-02-13 16:04:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-668d6bf9bc-znr52 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia99eb54da63 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.167 [INFO][4883] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4927] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" HandleID="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Workload="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.432 [INFO][4927] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" HandleID="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Workload="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039b5d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-668d6bf9bc-znr52", "timestamp":"2025-02-13 16:04:48.412712373 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.433 [INFO][4927] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4927] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.757 [INFO][4927] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.836 [INFO][4927] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.841 [INFO][4927] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.845 [INFO][4927] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.846 [INFO][4927] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.847 [INFO][4927] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.847 [INFO][4927] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.851 [INFO][4927] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.856 [INFO][4927] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.861 [INFO][4927] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.861 [INFO][4927] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" host="localhost" Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.861 [INFO][4927] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:48.893976 containerd[1557]: 2025-02-13 16:04:48.861 [INFO][4927] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" HandleID="k8s-pod-network.4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Workload="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.868 [INFO][4883] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--znr52-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7dfb39fc-a74d-477a-9521-d86660f1b99b", ResourceVersion:"655", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-668d6bf9bc-znr52", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia99eb54da63", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.869 [INFO][4883] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.869 [INFO][4883] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia99eb54da63 ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.886 [INFO][4883] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.886 
[INFO][4883] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--668d6bf9bc--znr52-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"7dfb39fc-a74d-477a-9521-d86660f1b99b", ResourceVersion:"655", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b", Pod:"coredns-668d6bf9bc-znr52", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia99eb54da63", MAC:"e2:f4:b5:8b:6d:ac", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:48.895369 containerd[1557]: 2025-02-13 16:04:48.891 [INFO][4883] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b" Namespace="kube-system" Pod="coredns-668d6bf9bc-znr52" WorkloadEndpoint="localhost-k8s-coredns--668d6bf9bc--znr52-eth0" Feb 13 16:04:48.929698 containerd[1557]: time="2025-02-13T16:04:48.929619714Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:48.929802 containerd[1557]: time="2025-02-13T16:04:48.929688318Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:48.929802 containerd[1557]: time="2025-02-13T16:04:48.929699737Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.929802 containerd[1557]: time="2025-02-13T16:04:48.929760962Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:48.943674 containerd[1557]: time="2025-02-13T16:04:48.943542802Z" level=info msg="CreateContainer within sandbox \"f83b129e828995d6bcf965c834deca4c5b87405af2f962e084c7af23b3d8c9cd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"481052526a7022bb31c9b310c6167ec3741333f5aeb827e023ff6b56896dc124\"" Feb 13 16:04:48.944219 containerd[1557]: time="2025-02-13T16:04:48.944094497Z" level=info msg="StartContainer for \"481052526a7022bb31c9b310c6167ec3741333f5aeb827e023ff6b56896dc124\"" Feb 13 16:04:48.945757 systemd[1]: Started cri-containerd-4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b.scope - libcontainer container 4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b. Feb 13 16:04:48.957730 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:48.976781 systemd[1]: Started cri-containerd-481052526a7022bb31c9b310c6167ec3741333f5aeb827e023ff6b56896dc124.scope - libcontainer container 481052526a7022bb31c9b310c6167ec3741333f5aeb827e023ff6b56896dc124. Feb 13 16:04:48.983810 systemd-networkd[1483]: cali39dae87c33d: Link UP Feb 13 16:04:48.983941 systemd-networkd[1483]: cali39dae87c33d: Gained carrier Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.140 [INFO][4862] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.166 [INFO][4862] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0 calico-kube-controllers-6cd7f7ffd8- calico-system 5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5 663 0 2025-02-13 16:04:31 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6cd7f7ffd8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-6cd7f7ffd8-hvv7z eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali39dae87c33d [] []}} ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.166 [INFO][4862] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.412 [INFO][4924] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" HandleID="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Workload="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.433 [INFO][4924] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" HandleID="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" 
Workload="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00025c980), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-6cd7f7ffd8-hvv7z", "timestamp":"2025-02-13 16:04:48.412840558 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.433 [INFO][4924] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.862 [INFO][4924] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.862 [INFO][4924] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.939 [INFO][4924] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.950 [INFO][4924] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.956 [INFO][4924] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.957 [INFO][4924] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.959 [INFO][4924] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.959 [INFO][4924] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.960 [INFO][4924] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.965 [INFO][4924] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.974 [INFO][4924] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.976 [INFO][4924] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" host="localhost" Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.976 [INFO][4924] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 16:04:49.002721 containerd[1557]: 2025-02-13 16:04:48.976 [INFO][4924] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" HandleID="k8s-pod-network.c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Workload="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.980 [INFO][4862] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0", GenerateName:"calico-kube-controllers-6cd7f7ffd8-", Namespace:"calico-system", SelfLink:"", UID:"5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6cd7f7ffd8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-6cd7f7ffd8-hvv7z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali39dae87c33d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.981 [INFO][4862] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.981 [INFO][4862] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali39dae87c33d ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.983 [INFO][4862] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.984 [INFO][4862] cni-plugin/k8s.go 414: Added Mac, interface name, and active container 
ID to endpoint ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0", GenerateName:"calico-kube-controllers-6cd7f7ffd8-", Namespace:"calico-system", SelfLink:"", UID:"5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5", ResourceVersion:"663", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 16, 4, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6cd7f7ffd8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc", Pod:"calico-kube-controllers-6cd7f7ffd8-hvv7z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali39dae87c33d", MAC:"06:64:70:d9:7b:7c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 16:04:49.005132 containerd[1557]: 2025-02-13 16:04:48.996 [INFO][4862] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc" Namespace="calico-system" Pod="calico-kube-controllers-6cd7f7ffd8-hvv7z" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--6cd7f7ffd8--hvv7z-eth0" Feb 13 16:04:49.009328 systemd[1]: run-netns-cni\x2d3bc7bf73\x2dd854\x2ddbc8\x2d3031\x2d30795d59cd63.mount: Deactivated successfully. Feb 13 16:04:49.009792 systemd[1]: run-netns-cni\x2db83ffece\x2dbed0\x2dec19\x2dedf3\x2d80286d55cb7e.mount: Deactivated successfully. Feb 13 16:04:49.021561 containerd[1557]: time="2025-02-13T16:04:49.021419608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-znr52,Uid:7dfb39fc-a74d-477a-9521-d86660f1b99b,Namespace:kube-system,Attempt:6,} returns sandbox id \"4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b\"" Feb 13 16:04:49.027791 containerd[1557]: time="2025-02-13T16:04:49.027771693Z" level=info msg="CreateContainer within sandbox \"4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 16:04:49.042330 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3935503499.mount: Deactivated successfully. 
Feb 13 16:04:49.043331 containerd[1557]: time="2025-02-13T16:04:49.043309435Z" level=info msg="CreateContainer within sandbox \"4906431a76b9fc43dde135a4817bdb3ab517abd8208e10b56f7374113361140b\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"874b7af5ae88aa8630f9250484dd40825fc355b173899f198fade621160bb2e3\"" Feb 13 16:04:49.044598 containerd[1557]: time="2025-02-13T16:04:49.044583244Z" level=info msg="StartContainer for \"874b7af5ae88aa8630f9250484dd40825fc355b173899f198fade621160bb2e3\"" Feb 13 16:04:49.057020 containerd[1557]: time="2025-02-13T16:04:49.039341344Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 16:04:49.057020 containerd[1557]: time="2025-02-13T16:04:49.039380547Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 16:04:49.057020 containerd[1557]: time="2025-02-13T16:04:49.039390409Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:49.057020 containerd[1557]: time="2025-02-13T16:04:49.039437501Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 16:04:49.060719 containerd[1557]: time="2025-02-13T16:04:49.060628233Z" level=info msg="StartContainer for \"481052526a7022bb31c9b310c6167ec3741333f5aeb827e023ff6b56896dc124\" returns successfully" Feb 13 16:04:49.077723 systemd[1]: Started cri-containerd-c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc.scope - libcontainer container c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc. Feb 13 16:04:49.080361 systemd[1]: Started cri-containerd-874b7af5ae88aa8630f9250484dd40825fc355b173899f198fade621160bb2e3.scope - libcontainer container 874b7af5ae88aa8630f9250484dd40825fc355b173899f198fade621160bb2e3. 
Feb 13 16:04:49.088297 systemd-resolved[1422]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 16:04:49.097021 containerd[1557]: time="2025-02-13T16:04:49.096999426Z" level=info msg="StartContainer for \"874b7af5ae88aa8630f9250484dd40825fc355b173899f198fade621160bb2e3\" returns successfully" Feb 13 16:04:49.125975 containerd[1557]: time="2025-02-13T16:04:49.125946988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6cd7f7ffd8-hvv7z,Uid:5eb4ecb4-cc42-4d1d-ab4a-ce503a56daa5,Namespace:calico-system,Attempt:7,} returns sandbox id \"c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc\"" Feb 13 16:04:49.130889 kubelet[2822]: I0213 16:04:49.130840 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-znr52" podStartSLOduration=24.130826926 podStartE2EDuration="24.130826926s" podCreationTimestamp="2025-02-13 16:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:49.130485625 +0000 UTC m=+28.549674300" watchObservedRunningTime="2025-02-13 16:04:49.130826926 +0000 UTC m=+28.550015606" Feb 13 16:04:49.131583 kubelet[2822]: I0213 16:04:49.130931 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-x9p69" podStartSLOduration=24.130926283 podStartE2EDuration="24.130926283s" podCreationTimestamp="2025-02-13 16:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 16:04:49.119112899 +0000 UTC m=+28.538301579" watchObservedRunningTime="2025-02-13 16:04:49.130926283 +0000 UTC m=+28.550114962" Feb 13 16:04:49.647923 systemd-networkd[1483]: calib9924be8bb8: Gained IPv6LL Feb 13 16:04:49.967738 systemd-networkd[1483]: cali168497c2b6c: Gained IPv6LL Feb 13 16:04:50.159974 systemd-networkd[1483]: calia99eb54da63: Gained IPv6LL Feb 13 16:04:50.160150 systemd-networkd[1483]: cali39dae87c33d: Gained IPv6LL Feb 13 16:04:50.236968 containerd[1557]: time="2025-02-13T16:04:50.236519930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:50.237220 containerd[1557]: time="2025-02-13T16:04:50.237020434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 16:04:50.238158 containerd[1557]: time="2025-02-13T16:04:50.237513100Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:50.238552 containerd[1557]: time="2025-02-13T16:04:50.238536299Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:50.239025 containerd[1557]: time="2025-02-13T16:04:50.239011204Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.689006832s" Feb 13 16:04:50.239082 containerd[1557]: 
time="2025-02-13T16:04:50.239073599Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 16:04:50.240038 containerd[1557]: time="2025-02-13T16:04:50.240027694Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 16:04:50.240590 containerd[1557]: time="2025-02-13T16:04:50.240571866Z" level=info msg="CreateContainer within sandbox \"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 16:04:50.253788 containerd[1557]: time="2025-02-13T16:04:50.253769297Z" level=info msg="CreateContainer within sandbox \"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e\"" Feb 13 16:04:50.254449 containerd[1557]: time="2025-02-13T16:04:50.254072702Z" level=info msg="StartContainer for \"677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e\"" Feb 13 16:04:50.272553 systemd[1]: run-containerd-runc-k8s.io-677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e-runc.WC31ka.mount: Deactivated successfully. Feb 13 16:04:50.278720 systemd[1]: Started cri-containerd-677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e.scope - libcontainer container 677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e. Feb 13 16:04:50.298352 containerd[1557]: time="2025-02-13T16:04:50.298327467Z" level=info msg="StartContainer for \"677f47ba47654229c3a35fd93e1b0670b96e001dd802ad1990a442269597469e\" returns successfully" Feb 13 16:04:50.607765 systemd-networkd[1483]: cali395e6fe55d3: Gained IPv6LL Feb 13 16:04:50.799837 systemd-networkd[1483]: cali869023cc7df: Gained IPv6LL Feb 13 16:04:52.488562 containerd[1557]: time="2025-02-13T16:04:52.488021611Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:52.489216 containerd[1557]: time="2025-02-13T16:04:52.489176518Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Feb 13 16:04:52.489721 containerd[1557]: time="2025-02-13T16:04:52.489704371Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:52.491433 containerd[1557]: time="2025-02-13T16:04:52.491417853Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:52.491926 containerd[1557]: time="2025-02-13T16:04:52.491903575Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.25181185s" Feb 13 16:04:52.492537 containerd[1557]: time="2025-02-13T16:04:52.492468432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" 
Feb 13 16:04:52.493672 containerd[1557]: time="2025-02-13T16:04:52.493516480Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 16:04:52.495635 containerd[1557]: time="2025-02-13T16:04:52.495610872Z" level=info msg="CreateContainer within sandbox \"749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 16:04:52.512829 containerd[1557]: time="2025-02-13T16:04:52.512808160Z" level=info msg="CreateContainer within sandbox \"749cbf7d7e1cae31db95ce6d667624dba7dd3dd625b367490ab62f40c8298886\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6eeff19a783a56bda435d50bd3a56114ddcafbc52112ac5dc42baf10a8bebca4\"" Feb 13 16:04:52.513578 containerd[1557]: time="2025-02-13T16:04:52.513248691Z" level=info msg="StartContainer for \"6eeff19a783a56bda435d50bd3a56114ddcafbc52112ac5dc42baf10a8bebca4\"" Feb 13 16:04:52.537757 systemd[1]: Started cri-containerd-6eeff19a783a56bda435d50bd3a56114ddcafbc52112ac5dc42baf10a8bebca4.scope - libcontainer container 6eeff19a783a56bda435d50bd3a56114ddcafbc52112ac5dc42baf10a8bebca4. Feb 13 16:04:52.568623 containerd[1557]: time="2025-02-13T16:04:52.567814703Z" level=info msg="StartContainer for \"6eeff19a783a56bda435d50bd3a56114ddcafbc52112ac5dc42baf10a8bebca4\" returns successfully" Feb 13 16:04:52.944312 containerd[1557]: time="2025-02-13T16:04:52.943942969Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:52.944757 containerd[1557]: time="2025-02-13T16:04:52.944739657Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Feb 13 16:04:52.945631 containerd[1557]: time="2025-02-13T16:04:52.945618678Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 452.082494ms" Feb 13 16:04:52.946044 containerd[1557]: time="2025-02-13T16:04:52.945701019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 16:04:52.946314 containerd[1557]: time="2025-02-13T16:04:52.946304557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Feb 13 16:04:52.947382 containerd[1557]: time="2025-02-13T16:04:52.947329425Z" level=info msg="CreateContainer within sandbox \"faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 16:04:52.965213 containerd[1557]: time="2025-02-13T16:04:52.965191583Z" level=info msg="CreateContainer within sandbox \"faa269c5c329bb284ac4a115d1af68605cb3038d18e27c973d329817bd6c0451\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4c684b31d835b4c37d98415d3e8d06c3ba98cd7350c5bc0fac39c8cfc0e35331\"" Feb 13 16:04:52.965805 containerd[1557]: time="2025-02-13T16:04:52.965604478Z" level=info msg="StartContainer for \"4c684b31d835b4c37d98415d3e8d06c3ba98cd7350c5bc0fac39c8cfc0e35331\"" Feb 13 16:04:52.984732 systemd[1]: Started 
cri-containerd-4c684b31d835b4c37d98415d3e8d06c3ba98cd7350c5bc0fac39c8cfc0e35331.scope - libcontainer container 4c684b31d835b4c37d98415d3e8d06c3ba98cd7350c5bc0fac39c8cfc0e35331. Feb 13 16:04:53.018167 containerd[1557]: time="2025-02-13T16:04:53.018146976Z" level=info msg="StartContainer for \"4c684b31d835b4c37d98415d3e8d06c3ba98cd7350c5bc0fac39c8cfc0e35331\" returns successfully" Feb 13 16:04:53.145166 kubelet[2822]: I0213 16:04:53.145091 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85f996756d-tfwqr" podStartSLOduration=18.284640934 podStartE2EDuration="22.145079782s" podCreationTimestamp="2025-02-13 16:04:31 +0000 UTC" firstStartedPulling="2025-02-13 16:04:48.632717906 +0000 UTC m=+28.051906575" lastFinishedPulling="2025-02-13 16:04:52.493156734 +0000 UTC m=+31.912345423" observedRunningTime="2025-02-13 16:04:53.145078357 +0000 UTC m=+32.564267037" watchObservedRunningTime="2025-02-13 16:04:53.145079782 +0000 UTC m=+32.564268457" Feb 13 16:04:53.165949 kubelet[2822]: I0213 16:04:53.165908 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-85f996756d-tldgb" podStartSLOduration=17.940770432 podStartE2EDuration="22.165898296s" podCreationTimestamp="2025-02-13 16:04:31 +0000 UTC" firstStartedPulling="2025-02-13 16:04:48.721111267 +0000 UTC m=+28.140299938" lastFinishedPulling="2025-02-13 16:04:52.946239131 +0000 UTC m=+32.365427802" observedRunningTime="2025-02-13 16:04:53.165793111 +0000 UTC m=+32.584981792" watchObservedRunningTime="2025-02-13 16:04:53.165898296 +0000 UTC m=+32.585086971" Feb 13 16:04:54.071459 kubelet[2822]: I0213 16:04:54.071427 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:54.135072 kubelet[2822]: I0213 16:04:54.135053 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:54.135235 kubelet[2822]: I0213 16:04:54.135186 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:54.287673 kernel: bpftool[5682]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 16:04:54.567749 systemd-networkd[1483]: vxlan.calico: Link UP Feb 13 16:04:54.567756 systemd-networkd[1483]: vxlan.calico: Gained carrier Feb 13 16:04:55.339727 containerd[1557]: time="2025-02-13T16:04:55.339690690Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:55.340361 containerd[1557]: time="2025-02-13T16:04:55.340334768Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Feb 13 16:04:55.342701 containerd[1557]: time="2025-02-13T16:04:55.342669533Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:55.344463 containerd[1557]: time="2025-02-13T16:04:55.343670046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:55.344463 containerd[1557]: time="2025-02-13T16:04:55.344374874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.397918679s" Feb 13 16:04:55.344463 containerd[1557]: time="2025-02-13T16:04:55.344393726Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Feb 13 16:04:55.345661 containerd[1557]: time="2025-02-13T16:04:55.345078159Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 16:04:55.354840 kubelet[2822]: I0213 16:04:55.354827 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:55.355698 containerd[1557]: time="2025-02-13T16:04:55.355679884Z" level=info msg="CreateContainer within sandbox \"c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Feb 13 16:04:55.365788 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount128709038.mount: Deactivated successfully. Feb 13 16:04:55.373242 containerd[1557]: time="2025-02-13T16:04:55.373221202Z" level=info msg="CreateContainer within sandbox \"c5882b62250e833c50539ec5a354b1619b501f9deb2f5db4d878f1d7ac2d19cc\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9569252e8bd102502746f4b39bbc2e5d12c9c8d75e20cc71b5ebd828e4fe276e\"" Feb 13 16:04:55.374832 containerd[1557]: time="2025-02-13T16:04:55.374815510Z" level=info msg="StartContainer for \"9569252e8bd102502746f4b39bbc2e5d12c9c8d75e20cc71b5ebd828e4fe276e\"" Feb 13 16:04:55.419840 systemd[1]: Started cri-containerd-9569252e8bd102502746f4b39bbc2e5d12c9c8d75e20cc71b5ebd828e4fe276e.scope - libcontainer container 9569252e8bd102502746f4b39bbc2e5d12c9c8d75e20cc71b5ebd828e4fe276e. 
Feb 13 16:04:55.456805 containerd[1557]: time="2025-02-13T16:04:55.456777955Z" level=info msg="StartContainer for \"9569252e8bd102502746f4b39bbc2e5d12c9c8d75e20cc71b5ebd828e4fe276e\" returns successfully" Feb 13 16:04:56.149702 kubelet[2822]: I0213 16:04:56.149366 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6cd7f7ffd8-hvv7z" podStartSLOduration=18.931178427 podStartE2EDuration="25.149350746s" podCreationTimestamp="2025-02-13 16:04:31 +0000 UTC" firstStartedPulling="2025-02-13 16:04:49.126798656 +0000 UTC m=+28.545987324" lastFinishedPulling="2025-02-13 16:04:55.344970971 +0000 UTC m=+34.764159643" observedRunningTime="2025-02-13 16:04:56.148977094 +0000 UTC m=+35.568165781" watchObservedRunningTime="2025-02-13 16:04:56.149350746 +0000 UTC m=+35.568539429" Feb 13 16:04:56.387105 kubelet[2822]: I0213 16:04:56.386824 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:56.495743 systemd-networkd[1483]: vxlan.calico: Gained IPv6LL Feb 13 16:04:57.150492 kubelet[2822]: I0213 16:04:57.150399 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:04:57.899998 containerd[1557]: time="2025-02-13T16:04:57.899744924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:57.905217 containerd[1557]: time="2025-02-13T16:04:57.905184886Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 16:04:57.913473 containerd[1557]: time="2025-02-13T16:04:57.913439816Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:57.921993 containerd[1557]: time="2025-02-13T16:04:57.921959168Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 16:04:57.922476 containerd[1557]: time="2025-02-13T16:04:57.922454516Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 2.577359149s" Feb 13 16:04:57.922518 containerd[1557]: time="2025-02-13T16:04:57.922477240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 16:04:57.928853 containerd[1557]: time="2025-02-13T16:04:57.928820085Z" level=info msg="CreateContainer within sandbox \"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 16:04:57.967056 containerd[1557]: time="2025-02-13T16:04:57.967026750Z" level=info msg="CreateContainer within sandbox \"ef495f5d22414d42df9c6d9aab7c968a86fae4d443c7df8ef599080c85ebe7b6\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id 
\"ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d\"" Feb 13 16:04:57.968786 containerd[1557]: time="2025-02-13T16:04:57.967536060Z" level=info msg="StartContainer for \"ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d\"" Feb 13 16:04:57.968174 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1557081575.mount: Deactivated successfully. Feb 13 16:04:57.987537 systemd[1]: run-containerd-runc-k8s.io-ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d-runc.V5GgWL.mount: Deactivated successfully. Feb 13 16:04:57.992729 systemd[1]: Started cri-containerd-ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d.scope - libcontainer container ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d. Feb 13 16:04:58.010103 containerd[1557]: time="2025-02-13T16:04:58.010041919Z" level=info msg="StartContainer for \"ada84389f4e3f4c2aeb75706e213b81338286fb8b0f33b7e1851dd164d21699d\" returns successfully" Feb 13 16:04:58.178190 kubelet[2822]: I0213 16:04:58.178048 2822 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-tndq9" podStartSLOduration=17.803036021 podStartE2EDuration="27.178029942s" podCreationTimestamp="2025-02-13 16:04:31 +0000 UTC" firstStartedPulling="2025-02-13 16:04:48.549723829 +0000 UTC m=+27.968912497" lastFinishedPulling="2025-02-13 16:04:57.924717741 +0000 UTC m=+37.343906418" observedRunningTime="2025-02-13 16:04:58.170951877 +0000 UTC m=+37.590140565" watchObservedRunningTime="2025-02-13 16:04:58.178029942 +0000 UTC m=+37.597218624" Feb 13 16:04:58.924918 kubelet[2822]: I0213 16:04:58.924900 2822 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 16:04:58.931463 kubelet[2822]: I0213 16:04:58.931455 2822 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 16:05:00.559414 kubelet[2822]: I0213 16:05:00.559016 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:05:01.435253 kubelet[2822]: I0213 16:05:01.435080 2822 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 16:05:20.699441 containerd[1557]: time="2025-02-13T16:05:20.699348784Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:05:20.709995 containerd[1557]: time="2025-02-13T16:05:20.702295892Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:05:20.709995 containerd[1557]: time="2025-02-13T16:05:20.709960861Z" level=info msg="StopPodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:05:20.722169 containerd[1557]: time="2025-02-13T16:05:20.722149971Z" level=info msg="RemovePodSandbox for \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:05:20.728499 containerd[1557]: time="2025-02-13T16:05:20.728483215Z" level=info msg="Forcibly stopping sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\"" Feb 13 16:05:20.731329 containerd[1557]: time="2025-02-13T16:05:20.728532265Z" level=info msg="TearDown network for sandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" successfully" Feb 13 16:05:20.735195 containerd[1557]: 
time="2025-02-13T16:05:20.735175875Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.738521 containerd[1557]: time="2025-02-13T16:05:20.738441346Z" level=info msg="RemovePodSandbox \"b9f9d15d5e4640670b98ea087a98eff133799fd98accc2566e609b15a2e65aae\" returns successfully" Feb 13 16:05:20.743469 containerd[1557]: time="2025-02-13T16:05:20.743458432Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:05:20.744402 containerd[1557]: time="2025-02-13T16:05:20.743584973Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:05:20.744402 containerd[1557]: time="2025-02-13T16:05:20.743594341Z" level=info msg="StopPodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:05:20.744402 containerd[1557]: time="2025-02-13T16:05:20.743716131Z" level=info msg="RemovePodSandbox for \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:05:20.744402 containerd[1557]: time="2025-02-13T16:05:20.743727664Z" level=info msg="Forcibly stopping sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\"" Feb 13 16:05:20.744402 containerd[1557]: time="2025-02-13T16:05:20.743757013Z" level=info msg="TearDown network for sandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" successfully" Feb 13 16:05:20.745366 containerd[1557]: time="2025-02-13T16:05:20.745326428Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.745366 containerd[1557]: time="2025-02-13T16:05:20.745349019Z" level=info msg="RemovePodSandbox \"342a57954450baad6b9167ded61de1f14c0f1cb4d03fb2f5c0adf134f91ade6a\" returns successfully" Feb 13 16:05:20.745635 containerd[1557]: time="2025-02-13T16:05:20.745558478Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:05:20.745635 containerd[1557]: time="2025-02-13T16:05:20.745603191Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:05:20.745635 containerd[1557]: time="2025-02-13T16:05:20.745609902Z" level=info msg="StopPodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:05:20.746254 containerd[1557]: time="2025-02-13T16:05:20.745804635Z" level=info msg="RemovePodSandbox for \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:05:20.746254 containerd[1557]: time="2025-02-13T16:05:20.745815711Z" level=info msg="Forcibly stopping sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\"" Feb 13 16:05:20.746254 containerd[1557]: time="2025-02-13T16:05:20.745872058Z" level=info msg="TearDown network for sandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" successfully" Feb 13 16:05:20.747338 containerd[1557]: time="2025-02-13T16:05:20.747327192Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.747406 containerd[1557]: time="2025-02-13T16:05:20.747397257Z" level=info msg="RemovePodSandbox \"517c92ab7cb0026f5cde35a50e094d09c973a41db4b1f5f6f57e79568258cdcf\" returns successfully" Feb 13 16:05:20.747553 containerd[1557]: time="2025-02-13T16:05:20.747544301Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:05:20.747675 containerd[1557]: time="2025-02-13T16:05:20.747665840Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:05:20.747717 containerd[1557]: time="2025-02-13T16:05:20.747710137Z" level=info msg="StopPodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:05:20.748155 containerd[1557]: time="2025-02-13T16:05:20.748146429Z" level=info msg="RemovePodSandbox for \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:05:20.748223 containerd[1557]: time="2025-02-13T16:05:20.748198128Z" level=info msg="Forcibly stopping sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\"" Feb 13 16:05:20.748302 containerd[1557]: time="2025-02-13T16:05:20.748280612Z" level=info msg="TearDown network for sandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" successfully" Feb 13 16:05:20.749419 containerd[1557]: time="2025-02-13T16:05:20.749390153Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.749491 containerd[1557]: time="2025-02-13T16:05:20.749482591Z" level=info msg="RemovePodSandbox \"1113276c24c2992495472eccbcd3abb9b8aba2cab498b7a429b0288064997fb8\" returns successfully" Feb 13 16:05:20.749765 containerd[1557]: time="2025-02-13T16:05:20.749678982Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:05:20.749765 containerd[1557]: time="2025-02-13T16:05:20.749733332Z" level=info msg="TearDown network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" successfully" Feb 13 16:05:20.749765 containerd[1557]: time="2025-02-13T16:05:20.749739145Z" level=info msg="StopPodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" returns successfully" Feb 13 16:05:20.750656 containerd[1557]: time="2025-02-13T16:05:20.750613521Z" level=info msg="RemovePodSandbox for \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:05:20.750656 containerd[1557]: time="2025-02-13T16:05:20.750625199Z" level=info msg="Forcibly stopping sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\"" Feb 13 16:05:20.751042 containerd[1557]: time="2025-02-13T16:05:20.750755987Z" level=info msg="TearDown network for sandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" successfully" Feb 13 16:05:20.751939 containerd[1557]: time="2025-02-13T16:05:20.751927819Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.752691 containerd[1557]: time="2025-02-13T16:05:20.752072861Z" level=info msg="RemovePodSandbox \"3980a28850f979c123f5ee67e0536d15934eba3fde8d338525a4e684ce4c7b4f\" returns successfully" Feb 13 16:05:20.752778 containerd[1557]: time="2025-02-13T16:05:20.752767037Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" Feb 13 16:05:20.752878 containerd[1557]: time="2025-02-13T16:05:20.752868054Z" level=info msg="TearDown network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" successfully" Feb 13 16:05:20.753075 containerd[1557]: time="2025-02-13T16:05:20.752951653Z" level=info msg="StopPodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" returns successfully" Feb 13 16:05:20.753101 containerd[1557]: time="2025-02-13T16:05:20.753074955Z" level=info msg="RemovePodSandbox for \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" Feb 13 16:05:20.753101 containerd[1557]: time="2025-02-13T16:05:20.753086766Z" level=info msg="Forcibly stopping sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\"" Feb 13 16:05:20.753689 containerd[1557]: time="2025-02-13T16:05:20.753125006Z" level=info msg="TearDown network for sandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" successfully" Feb 13 16:05:20.755533 containerd[1557]: time="2025-02-13T16:05:20.755465576Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.755533 containerd[1557]: time="2025-02-13T16:05:20.755497336Z" level=info msg="RemovePodSandbox \"b59b81f46afb738cf089a215befb675494122b58e3dd2c679aed44fb84c28777\" returns successfully" Feb 13 16:05:20.755831 containerd[1557]: time="2025-02-13T16:05:20.755742761Z" level=info msg="StopPodSandbox for \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\"" Feb 13 16:05:20.755831 containerd[1557]: time="2025-02-13T16:05:20.755799082Z" level=info msg="TearDown network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" successfully" Feb 13 16:05:20.755831 containerd[1557]: time="2025-02-13T16:05:20.755806159Z" level=info msg="StopPodSandbox for \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" returns successfully" Feb 13 16:05:20.756057 containerd[1557]: time="2025-02-13T16:05:20.756009542Z" level=info msg="RemovePodSandbox for \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\"" Feb 13 16:05:20.756057 containerd[1557]: time="2025-02-13T16:05:20.756024013Z" level=info msg="Forcibly stopping sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\"" Feb 13 16:05:20.756151 containerd[1557]: time="2025-02-13T16:05:20.756061238Z" level=info msg="TearDown network for sandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" successfully" Feb 13 16:05:20.757788 containerd[1557]: time="2025-02-13T16:05:20.757772929Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.757842 containerd[1557]: time="2025-02-13T16:05:20.757795875Z" level=info msg="RemovePodSandbox \"d59e53b038244ca69c09105d700a8ea7fec5f8eeae4c5c901c76208ab2ac0444\" returns successfully" Feb 13 16:05:20.758103 containerd[1557]: time="2025-02-13T16:05:20.757985619Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:05:20.758103 containerd[1557]: time="2025-02-13T16:05:20.758028814Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:05:20.758103 containerd[1557]: time="2025-02-13T16:05:20.758035269Z" level=info msg="StopPodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.758251289Z" level=info msg="RemovePodSandbox for \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.758262070Z" level=info msg="Forcibly stopping sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.758304297Z" level=info msg="TearDown network for sandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759494071Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759516507Z" level=info msg="RemovePodSandbox \"4d9b8ba0d8a8a1b5a82be2bbb7c0b967d7a988f148c0e6800e2c0c85736c5da9\" returns successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759654466Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759697217Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759703565Z" level=info msg="StopPodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759865429Z" level=info msg="RemovePodSandbox for \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759877028Z" level=info msg="Forcibly stopping sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.759914674Z" level=info msg="TearDown network for sandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.760997845Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761016265Z" level=info msg="RemovePodSandbox \"c90a2cc48185e33bbc043c8a8ef18c205262ff6e7a16326926734d385025aeac\" returns successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761196647Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761253910Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761261152Z" level=info msg="StopPodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761396571Z" level=info msg="RemovePodSandbox for \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761405456Z" level=info msg="Forcibly stopping sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\"" Feb 13 16:05:20.762327 containerd[1557]: time="2025-02-13T16:05:20.761480838Z" level=info msg="TearDown network for sandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" successfully" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.762803963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.762822066Z" level=info msg="RemovePodSandbox \"0505e95e79c19626dd61353e214ac7768f30cb37b1d353c1707ad0353bad7323\" returns successfully" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763003785Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763044002Z" level=info msg="TearDown network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" successfully" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763050120Z" level=info msg="StopPodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" returns successfully" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763166461Z" level=info msg="RemovePodSandbox for \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763176408Z" level=info msg="Forcibly stopping sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\"" Feb 13 16:05:20.763499 containerd[1557]: time="2025-02-13T16:05:20.763214447Z" level=info msg="TearDown network for sandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" successfully" Feb 13 16:05:20.764615 containerd[1557]: time="2025-02-13T16:05:20.764599727Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.768430 containerd[1557]: time="2025-02-13T16:05:20.768413377Z" level=info msg="RemovePodSandbox \"e62e86f7b12dedbf927486d95368752aaa5ad30a9214803d0ffde7189f915a44\" returns successfully" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768697489Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768763153Z" level=info msg="TearDown network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" successfully" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768769452Z" level=info msg="StopPodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" returns successfully" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768891322Z" level=info msg="RemovePodSandbox for \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768900594Z" level=info msg="Forcibly stopping sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\"" Feb 13 16:05:20.769517 containerd[1557]: time="2025-02-13T16:05:20.768928447Z" level=info msg="TearDown network for sandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" successfully" Feb 13 16:05:20.770251 containerd[1557]: time="2025-02-13T16:05:20.770238963Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.770300 containerd[1557]: time="2025-02-13T16:05:20.770292483Z" level=info msg="RemovePodSandbox \"aba602d4aff48bfe9f52debfae571c0375d2e64390b4f56cd6b33d62af7b16c0\" returns successfully" Feb 13 16:05:20.770554 containerd[1557]: time="2025-02-13T16:05:20.770540817Z" level=info msg="StopPodSandbox for \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\"" Feb 13 16:05:20.770594 containerd[1557]: time="2025-02-13T16:05:20.770583693Z" level=info msg="TearDown network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" successfully" Feb 13 16:05:20.770623 containerd[1557]: time="2025-02-13T16:05:20.770607956Z" level=info msg="StopPodSandbox for \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" returns successfully" Feb 13 16:05:20.770890 containerd[1557]: time="2025-02-13T16:05:20.770735177Z" level=info msg="RemovePodSandbox for \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\"" Feb 13 16:05:20.770890 containerd[1557]: time="2025-02-13T16:05:20.770745458Z" level=info msg="Forcibly stopping sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\"" Feb 13 16:05:20.770890 containerd[1557]: time="2025-02-13T16:05:20.770775877Z" level=info msg="TearDown network for sandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" successfully" Feb 13 16:05:20.772051 containerd[1557]: time="2025-02-13T16:05:20.772036897Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.772121 containerd[1557]: time="2025-02-13T16:05:20.772058593Z" level=info msg="RemovePodSandbox \"5a40c3b7a255d6cebab82d03b24208676fc9cb699fce72af1014e9d2bb387ddb\" returns successfully" Feb 13 16:05:20.772382 containerd[1557]: time="2025-02-13T16:05:20.772257621Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:05:20.772382 containerd[1557]: time="2025-02-13T16:05:20.772302253Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:05:20.772382 containerd[1557]: time="2025-02-13T16:05:20.772308589Z" level=info msg="StopPodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:05:20.772729 containerd[1557]: time="2025-02-13T16:05:20.772509564Z" level=info msg="RemovePodSandbox for \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:05:20.772729 containerd[1557]: time="2025-02-13T16:05:20.772520590Z" level=info msg="Forcibly stopping sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\"" Feb 13 16:05:20.772729 containerd[1557]: time="2025-02-13T16:05:20.772559103Z" level=info msg="TearDown network for sandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" successfully" Feb 13 16:05:20.777893 containerd[1557]: time="2025-02-13T16:05:20.777878904Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.777923 containerd[1557]: time="2025-02-13T16:05:20.777899682Z" level=info msg="RemovePodSandbox \"f4cc446ddb7300afb1de27e30eef2cd12326fa8a1c11eb938e070e1a7adedc34\" returns successfully" Feb 13 16:05:20.778101 containerd[1557]: time="2025-02-13T16:05:20.778028418Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:05:20.778101 containerd[1557]: time="2025-02-13T16:05:20.778069549Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:05:20.778101 containerd[1557]: time="2025-02-13T16:05:20.778075673Z" level=info msg="StopPodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:05:20.778195 containerd[1557]: time="2025-02-13T16:05:20.778180045Z" level=info msg="RemovePodSandbox for \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:05:20.778195 containerd[1557]: time="2025-02-13T16:05:20.778194153Z" level=info msg="Forcibly stopping sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\"" Feb 13 16:05:20.778243 containerd[1557]: time="2025-02-13T16:05:20.778226989Z" level=info msg="TearDown network for sandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" successfully" Feb 13 16:05:20.779293 containerd[1557]: time="2025-02-13T16:05:20.779279123Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.779324 containerd[1557]: time="2025-02-13T16:05:20.779300025Z" level=info msg="RemovePodSandbox \"61cbe63a16e115eae1f63dbcf387fad691d9ee704a31e5a3696a16eea9a81556\" returns successfully" Feb 13 16:05:20.779504 containerd[1557]: time="2025-02-13T16:05:20.779470588Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:05:20.779614 containerd[1557]: time="2025-02-13T16:05:20.779572096Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:05:20.779614 containerd[1557]: time="2025-02-13T16:05:20.779580298Z" level=info msg="StopPodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:05:20.779878 containerd[1557]: time="2025-02-13T16:05:20.779790808Z" level=info msg="RemovePodSandbox for \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:05:20.779878 containerd[1557]: time="2025-02-13T16:05:20.779802038Z" level=info msg="Forcibly stopping sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\"" Feb 13 16:05:20.779878 containerd[1557]: time="2025-02-13T16:05:20.779829835Z" level=info msg="TearDown network for sandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" successfully" Feb 13 16:05:20.781001 containerd[1557]: time="2025-02-13T16:05:20.780988977Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.781149 containerd[1557]: time="2025-02-13T16:05:20.781054107Z" level=info msg="RemovePodSandbox \"70eae9e62d9fdb9b7d2975c75c781fd6ddb844339ff6ea9a2c8c42590caeee1d\" returns successfully" Feb 13 16:05:20.781187 containerd[1557]: time="2025-02-13T16:05:20.781177308Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:05:20.781225 containerd[1557]: time="2025-02-13T16:05:20.781213450Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:05:20.781225 containerd[1557]: time="2025-02-13T16:05:20.781221754Z" level=info msg="StopPodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:05:20.781707 containerd[1557]: time="2025-02-13T16:05:20.781362581Z" level=info msg="RemovePodSandbox for \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:05:20.781707 containerd[1557]: time="2025-02-13T16:05:20.781375304Z" level=info msg="Forcibly stopping sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\"" Feb 13 16:05:20.781707 containerd[1557]: time="2025-02-13T16:05:20.781426593Z" level=info msg="TearDown network for sandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" successfully" Feb 13 16:05:20.782687 containerd[1557]: time="2025-02-13T16:05:20.782675580Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.782746 containerd[1557]: time="2025-02-13T16:05:20.782737738Z" level=info msg="RemovePodSandbox \"620fb7370fec71533511e490718b4f9f0a44bcf17b6158722d064f5dfd91caaf\" returns successfully" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783009400Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783044471Z" level=info msg="TearDown network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" successfully" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783050113Z" level=info msg="StopPodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" returns successfully" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783181349Z" level=info msg="RemovePodSandbox for \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783192924Z" level=info msg="Forcibly stopping sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\"" Feb 13 16:05:20.783694 containerd[1557]: time="2025-02-13T16:05:20.783266221Z" level=info msg="TearDown network for sandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" successfully" Feb 13 16:05:20.784356 containerd[1557]: time="2025-02-13T16:05:20.784343104Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.784385 containerd[1557]: time="2025-02-13T16:05:20.784363107Z" level=info msg="RemovePodSandbox \"abf6974dec7b7c42b7c18b3d369106f64c2d19b8bbae6fe8672cb3a050b7f2db\" returns successfully" Feb 13 16:05:20.784593 containerd[1557]: time="2025-02-13T16:05:20.784516730Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" Feb 13 16:05:20.784593 containerd[1557]: time="2025-02-13T16:05:20.784555719Z" level=info msg="TearDown network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" successfully" Feb 13 16:05:20.784593 containerd[1557]: time="2025-02-13T16:05:20.784561719Z" level=info msg="StopPodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" returns successfully" Feb 13 16:05:20.785041 containerd[1557]: time="2025-02-13T16:05:20.784673720Z" level=info msg="RemovePodSandbox for \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" Feb 13 16:05:20.785041 containerd[1557]: time="2025-02-13T16:05:20.784683432Z" level=info msg="Forcibly stopping sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\"" Feb 13 16:05:20.785041 containerd[1557]: time="2025-02-13T16:05:20.784714395Z" level=info msg="TearDown network for sandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" successfully" Feb 13 16:05:20.785840 containerd[1557]: time="2025-02-13T16:05:20.785824868Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.785872 containerd[1557]: time="2025-02-13T16:05:20.785846788Z" level=info msg="RemovePodSandbox \"6f71638e3f9789f0c74e6044292aa2b428f2942869869ed793d04cc3bb32721b\" returns successfully" Feb 13 16:05:20.785983 containerd[1557]: time="2025-02-13T16:05:20.785972892Z" level=info msg="StopPodSandbox for \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\"" Feb 13 16:05:20.786118 containerd[1557]: time="2025-02-13T16:05:20.786083622Z" level=info msg="TearDown network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" successfully" Feb 13 16:05:20.786118 containerd[1557]: time="2025-02-13T16:05:20.786092036Z" level=info msg="StopPodSandbox for \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" returns successfully" Feb 13 16:05:20.786990 containerd[1557]: time="2025-02-13T16:05:20.786196665Z" level=info msg="RemovePodSandbox for \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\"" Feb 13 16:05:20.786990 containerd[1557]: time="2025-02-13T16:05:20.786206618Z" level=info msg="Forcibly stopping sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\"" Feb 13 16:05:20.786990 containerd[1557]: time="2025-02-13T16:05:20.786233889Z" level=info msg="TearDown network for sandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" successfully" Feb 13 16:05:20.787355 containerd[1557]: time="2025-02-13T16:05:20.787343023Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.787424 containerd[1557]: time="2025-02-13T16:05:20.787414765Z" level=info msg="RemovePodSandbox \"b55f3c1b4b262352f6c6409cbf86e50c61c14b378866ae4678ac0888d6662154\" returns successfully" Feb 13 16:05:20.787576 containerd[1557]: time="2025-02-13T16:05:20.787563089Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:05:20.787610 containerd[1557]: time="2025-02-13T16:05:20.787603363Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:05:20.787630 containerd[1557]: time="2025-02-13T16:05:20.787609461Z" level=info msg="StopPodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:05:20.787753 containerd[1557]: time="2025-02-13T16:05:20.787737724Z" level=info msg="RemovePodSandbox for \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:05:20.787779 containerd[1557]: time="2025-02-13T16:05:20.787773612Z" level=info msg="Forcibly stopping sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\"" Feb 13 16:05:20.787845 containerd[1557]: time="2025-02-13T16:05:20.787803235Z" level=info msg="TearDown network for sandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" successfully" Feb 13 16:05:20.788887 containerd[1557]: time="2025-02-13T16:05:20.788872527Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.788920 containerd[1557]: time="2025-02-13T16:05:20.788893486Z" level=info msg="RemovePodSandbox \"e9050e7d90ab14c32758ad1842a981c25457e3cba986ccff94a8c26ad3b6dce5\" returns successfully" Feb 13 16:05:20.789035 containerd[1557]: time="2025-02-13T16:05:20.789025243Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:05:20.789149 containerd[1557]: time="2025-02-13T16:05:20.789111186Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:05:20.789149 containerd[1557]: time="2025-02-13T16:05:20.789119621Z" level=info msg="StopPodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:05:20.789271 containerd[1557]: time="2025-02-13T16:05:20.789257474Z" level=info msg="RemovePodSandbox for \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:05:20.789303 containerd[1557]: time="2025-02-13T16:05:20.789271870Z" level=info msg="Forcibly stopping sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\"" Feb 13 16:05:20.789381 containerd[1557]: time="2025-02-13T16:05:20.789318663Z" level=info msg="TearDown network for sandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" successfully" Feb 13 16:05:20.790443 containerd[1557]: time="2025-02-13T16:05:20.790429063Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.790472 containerd[1557]: time="2025-02-13T16:05:20.790449014Z" level=info msg="RemovePodSandbox \"492974427d9dcf9642db84cd0e5a983a460952b751d3feddca272a4d066c3816\" returns successfully" Feb 13 16:05:20.790742 containerd[1557]: time="2025-02-13T16:05:20.790598327Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:05:20.790742 containerd[1557]: time="2025-02-13T16:05:20.790636323Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:05:20.790742 containerd[1557]: time="2025-02-13T16:05:20.790674759Z" level=info msg="StopPodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:05:20.790823 containerd[1557]: time="2025-02-13T16:05:20.790806491Z" level=info msg="RemovePodSandbox for \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:05:20.790853 containerd[1557]: time="2025-02-13T16:05:20.790819139Z" level=info msg="Forcibly stopping sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\"" Feb 13 16:05:20.790897 containerd[1557]: time="2025-02-13T16:05:20.790876004Z" level=info msg="TearDown network for sandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" successfully" Feb 13 16:05:20.792007 containerd[1557]: time="2025-02-13T16:05:20.791992416Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.792061 containerd[1557]: time="2025-02-13T16:05:20.792014070Z" level=info msg="RemovePodSandbox \"c9dfbc36d14bb870030c01b9439030e0564b82a510e8e2b70f7c69a6e574a44c\" returns successfully" Feb 13 16:05:20.792207 containerd[1557]: time="2025-02-13T16:05:20.792122350Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:05:20.792207 containerd[1557]: time="2025-02-13T16:05:20.792160446Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:05:20.792207 containerd[1557]: time="2025-02-13T16:05:20.792167018Z" level=info msg="StopPodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:05:20.792326 containerd[1557]: time="2025-02-13T16:05:20.792305958Z" level=info msg="RemovePodSandbox for \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:05:20.792326 containerd[1557]: time="2025-02-13T16:05:20.792320258Z" level=info msg="Forcibly stopping sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\"" Feb 13 16:05:20.792554 containerd[1557]: time="2025-02-13T16:05:20.792352014Z" level=info msg="TearDown network for sandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" successfully" Feb 13 16:05:20.793500 containerd[1557]: time="2025-02-13T16:05:20.793485659Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.793530 containerd[1557]: time="2025-02-13T16:05:20.793505541Z" level=info msg="RemovePodSandbox \"5b4f292555c7d3ad13f80481f1de2a9288a8727803e0cab0511fe77b681fbb7f\" returns successfully" Feb 13 16:05:20.793657 containerd[1557]: time="2025-02-13T16:05:20.793619055Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:05:20.793723 containerd[1557]: time="2025-02-13T16:05:20.793714947Z" level=info msg="TearDown network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" successfully" Feb 13 16:05:20.793850 containerd[1557]: time="2025-02-13T16:05:20.793756942Z" level=info msg="StopPodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" returns successfully" Feb 13 16:05:20.793900 containerd[1557]: time="2025-02-13T16:05:20.793882424Z" level=info msg="RemovePodSandbox for \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:05:20.793900 containerd[1557]: time="2025-02-13T16:05:20.793895482Z" level=info msg="Forcibly stopping sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\"" Feb 13 16:05:20.793941 containerd[1557]: time="2025-02-13T16:05:20.793926581Z" level=info msg="TearDown network for sandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" successfully" Feb 13 16:05:20.795024 containerd[1557]: time="2025-02-13T16:05:20.795008612Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.795071 containerd[1557]: time="2025-02-13T16:05:20.795030509Z" level=info msg="RemovePodSandbox \"fe80e0cbdb5b896aa76dea3f355e7fb73fe7041d43fb03ecd104f75e4e4fe93c\" returns successfully" Feb 13 16:05:20.795309 containerd[1557]: time="2025-02-13T16:05:20.795241309Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" Feb 13 16:05:20.795309 containerd[1557]: time="2025-02-13T16:05:20.795279766Z" level=info msg="TearDown network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" successfully" Feb 13 16:05:20.795309 containerd[1557]: time="2025-02-13T16:05:20.795286027Z" level=info msg="StopPodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" returns successfully" Feb 13 16:05:20.795538 containerd[1557]: time="2025-02-13T16:05:20.795424151Z" level=info msg="RemovePodSandbox for \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" Feb 13 16:05:20.795538 containerd[1557]: time="2025-02-13T16:05:20.795436078Z" level=info msg="Forcibly stopping sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\"" Feb 13 16:05:20.795538 containerd[1557]: time="2025-02-13T16:05:20.795508342Z" level=info msg="TearDown network for sandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" successfully" Feb 13 16:05:20.799070 containerd[1557]: time="2025-02-13T16:05:20.797384280Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.799070 containerd[1557]: time="2025-02-13T16:05:20.797415950Z" level=info msg="RemovePodSandbox \"fb53a222f92495d720bb409838dfd8a59494b0acca8d199b1abee1c29f539a33\" returns successfully" Feb 13 16:05:20.799696 containerd[1557]: time="2025-02-13T16:05:20.799139425Z" level=info msg="StopPodSandbox for \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\"" Feb 13 16:05:20.799696 containerd[1557]: time="2025-02-13T16:05:20.799201200Z" level=info msg="TearDown network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" successfully" Feb 13 16:05:20.799696 containerd[1557]: time="2025-02-13T16:05:20.799208863Z" level=info msg="StopPodSandbox for \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" returns successfully" Feb 13 16:05:20.803976 containerd[1557]: time="2025-02-13T16:05:20.803886375Z" level=info msg="RemovePodSandbox for \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\"" Feb 13 16:05:20.803976 containerd[1557]: time="2025-02-13T16:05:20.803900974Z" level=info msg="Forcibly stopping sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\"" Feb 13 16:05:20.803976 containerd[1557]: time="2025-02-13T16:05:20.803933870Z" level=info msg="TearDown network for sandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" successfully" Feb 13 16:05:20.805011 containerd[1557]: time="2025-02-13T16:05:20.804995397Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.805041 containerd[1557]: time="2025-02-13T16:05:20.805030222Z" level=info msg="RemovePodSandbox \"05783b16de88b79e2a92997a736c0483cb68f2129b16c1619fd7e15813ee60c7\" returns successfully" Feb 13 16:05:20.805269 containerd[1557]: time="2025-02-13T16:05:20.805187026Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:05:20.805269 containerd[1557]: time="2025-02-13T16:05:20.805226155Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:05:20.805269 containerd[1557]: time="2025-02-13T16:05:20.805246302Z" level=info msg="StopPodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:05:20.805569 containerd[1557]: time="2025-02-13T16:05:20.805443824Z" level=info msg="RemovePodSandbox for \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:05:20.805569 containerd[1557]: time="2025-02-13T16:05:20.805498855Z" level=info msg="Forcibly stopping sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\"" Feb 13 16:05:20.805569 containerd[1557]: time="2025-02-13T16:05:20.805540371Z" level=info msg="TearDown network for sandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" successfully" Feb 13 16:05:20.806837 containerd[1557]: time="2025-02-13T16:05:20.806825794Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.806895 containerd[1557]: time="2025-02-13T16:05:20.806885965Z" level=info msg="RemovePodSandbox \"7e51ce5beb19f3a37ca15783b8e5863c7d24fb72baa8707a521a36390e1c489f\" returns successfully" Feb 13 16:05:20.807074 containerd[1557]: time="2025-02-13T16:05:20.807065201Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:05:20.807146 containerd[1557]: time="2025-02-13T16:05:20.807138099Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:05:20.807186 containerd[1557]: time="2025-02-13T16:05:20.807179434Z" level=info msg="StopPodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:05:20.807337 containerd[1557]: time="2025-02-13T16:05:20.807328056Z" level=info msg="RemovePodSandbox for \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:05:20.807400 containerd[1557]: time="2025-02-13T16:05:20.807389771Z" level=info msg="Forcibly stopping sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\"" Feb 13 16:05:20.807471 containerd[1557]: time="2025-02-13T16:05:20.807454995Z" level=info msg="TearDown network for sandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" successfully" Feb 13 16:05:20.808620 containerd[1557]: time="2025-02-13T16:05:20.808572534Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.808620 containerd[1557]: time="2025-02-13T16:05:20.808591146Z" level=info msg="RemovePodSandbox \"abf9d3d110059b25293d12fdbca25775337454d1414d80afc1e36a82e27c4420\" returns successfully" Feb 13 16:05:20.808795 containerd[1557]: time="2025-02-13T16:05:20.808730812Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:05:20.808795 containerd[1557]: time="2025-02-13T16:05:20.808768464Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:05:20.808795 containerd[1557]: time="2025-02-13T16:05:20.808774105Z" level=info msg="StopPodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:05:20.809653 containerd[1557]: time="2025-02-13T16:05:20.809018656Z" level=info msg="RemovePodSandbox for \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:05:20.809653 containerd[1557]: time="2025-02-13T16:05:20.809029441Z" level=info msg="Forcibly stopping sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\"" Feb 13 16:05:20.809653 containerd[1557]: time="2025-02-13T16:05:20.809058502Z" level=info msg="TearDown network for sandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" successfully" Feb 13 16:05:20.810141 containerd[1557]: time="2025-02-13T16:05:20.810126906Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.810172 containerd[1557]: time="2025-02-13T16:05:20.810147564Z" level=info msg="RemovePodSandbox \"27ada4fc282d322e8c034cc4d4459457ad5d60fd21d3fa3dc837004faea69a36\" returns successfully" Feb 13 16:05:20.810399 containerd[1557]: time="2025-02-13T16:05:20.810316175Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:05:20.810399 containerd[1557]: time="2025-02-13T16:05:20.810361184Z" level=info msg="TearDown network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:05:20.810399 containerd[1557]: time="2025-02-13T16:05:20.810368016Z" level=info msg="StopPodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:05:20.810492 containerd[1557]: time="2025-02-13T16:05:20.810477457Z" level=info msg="RemovePodSandbox for \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:05:20.810519 containerd[1557]: time="2025-02-13T16:05:20.810491991Z" level=info msg="Forcibly stopping sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\"" Feb 13 16:05:20.810551 containerd[1557]: time="2025-02-13T16:05:20.810525441Z" level=info msg="TearDown network for sandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" successfully" Feb 13 16:05:20.811580 containerd[1557]: time="2025-02-13T16:05:20.811565697Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.811619 containerd[1557]: time="2025-02-13T16:05:20.811588254Z" level=info msg="RemovePodSandbox \"dd91c298e1a21d44e311393316dded71337cef30135529ee1070fc1b06544870\" returns successfully" Feb 13 16:05:20.811733 containerd[1557]: time="2025-02-13T16:05:20.811720180Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:05:20.812237 containerd[1557]: time="2025-02-13T16:05:20.811759960Z" level=info msg="TearDown network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" successfully" Feb 13 16:05:20.812237 containerd[1557]: time="2025-02-13T16:05:20.811779511Z" level=info msg="StopPodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" returns successfully" Feb 13 16:05:20.812237 containerd[1557]: time="2025-02-13T16:05:20.811893012Z" level=info msg="RemovePodSandbox for \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:05:20.812237 containerd[1557]: time="2025-02-13T16:05:20.811903345Z" level=info msg="Forcibly stopping sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\"" Feb 13 16:05:20.812237 containerd[1557]: time="2025-02-13T16:05:20.811930421Z" level=info msg="TearDown network for sandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" successfully" Feb 13 16:05:20.813057 containerd[1557]: time="2025-02-13T16:05:20.813043413Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.813125 containerd[1557]: time="2025-02-13T16:05:20.813063719Z" level=info msg="RemovePodSandbox \"5fc5ca843cf5a43a1997d74c6168ae7f83a579b7cbfebe965df5e77f6d83fd8f\" returns successfully" Feb 13 16:05:20.813282 containerd[1557]: time="2025-02-13T16:05:20.813200781Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" Feb 13 16:05:20.813282 containerd[1557]: time="2025-02-13T16:05:20.813249830Z" level=info msg="TearDown network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" successfully" Feb 13 16:05:20.813282 containerd[1557]: time="2025-02-13T16:05:20.813257998Z" level=info msg="StopPodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" returns successfully" Feb 13 16:05:20.813922 containerd[1557]: time="2025-02-13T16:05:20.813508459Z" level=info msg="RemovePodSandbox for \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" Feb 13 16:05:20.813922 containerd[1557]: time="2025-02-13T16:05:20.813528946Z" level=info msg="Forcibly stopping sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\"" Feb 13 16:05:20.813922 containerd[1557]: time="2025-02-13T16:05:20.813560850Z" level=info msg="TearDown network for sandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" successfully" Feb 13 16:05:20.814648 containerd[1557]: time="2025-02-13T16:05:20.814624567Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.814673 containerd[1557]: time="2025-02-13T16:05:20.814662536Z" level=info msg="RemovePodSandbox \"32eead22031a344a8a4882e05c974c6fe591c461f53f2dae21df3aae64c1395b\" returns successfully" Feb 13 16:05:20.814889 containerd[1557]: time="2025-02-13T16:05:20.814829230Z" level=info msg="StopPodSandbox for \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\"" Feb 13 16:05:20.814979 containerd[1557]: time="2025-02-13T16:05:20.814934171Z" level=info msg="TearDown network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" successfully" Feb 13 16:05:20.814979 containerd[1557]: time="2025-02-13T16:05:20.814943107Z" level=info msg="StopPodSandbox for \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" returns successfully" Feb 13 16:05:20.815612 containerd[1557]: time="2025-02-13T16:05:20.815098510Z" level=info msg="RemovePodSandbox for \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\"" Feb 13 16:05:20.815612 containerd[1557]: time="2025-02-13T16:05:20.815109989Z" level=info msg="Forcibly stopping sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\"" Feb 13 16:05:20.815612 containerd[1557]: time="2025-02-13T16:05:20.815145226Z" level=info msg="TearDown network for sandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" successfully" Feb 13 16:05:20.816198 containerd[1557]: time="2025-02-13T16:05:20.816182127Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.816231 containerd[1557]: time="2025-02-13T16:05:20.816203243Z" level=info msg="RemovePodSandbox \"4d1bc91c525450a78d86aea2022154b090511e54545729af6c92de994528f5e0\" returns successfully" Feb 13 16:05:20.816414 containerd[1557]: time="2025-02-13T16:05:20.816366442Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:05:20.816561 containerd[1557]: time="2025-02-13T16:05:20.816529814Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:05:20.816561 containerd[1557]: time="2025-02-13T16:05:20.816538644Z" level=info msg="StopPodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:05:20.816839 containerd[1557]: time="2025-02-13T16:05:20.816810653Z" level=info msg="RemovePodSandbox for \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:05:20.816839 containerd[1557]: time="2025-02-13T16:05:20.816838865Z" level=info msg="Forcibly stopping sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\"" Feb 13 16:05:20.816916 containerd[1557]: time="2025-02-13T16:05:20.816881123Z" level=info msg="TearDown network for sandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" successfully" Feb 13 16:05:20.817960 containerd[1557]: time="2025-02-13T16:05:20.817945592Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.818001 containerd[1557]: time="2025-02-13T16:05:20.817965817Z" level=info msg="RemovePodSandbox \"b7d84b132c1d8eca7a7f689f157930bf259c508b4ace6bc8fb441a82e7652343\" returns successfully" Feb 13 16:05:20.818133 containerd[1557]: time="2025-02-13T16:05:20.818113128Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:05:20.818171 containerd[1557]: time="2025-02-13T16:05:20.818160497Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:05:20.818194 containerd[1557]: time="2025-02-13T16:05:20.818170065Z" level=info msg="StopPodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:05:20.819028 containerd[1557]: time="2025-02-13T16:05:20.818303554Z" level=info msg="RemovePodSandbox for \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:05:20.819028 containerd[1557]: time="2025-02-13T16:05:20.818315535Z" level=info msg="Forcibly stopping sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\"" Feb 13 16:05:20.819028 containerd[1557]: time="2025-02-13T16:05:20.818346803Z" level=info msg="TearDown network for sandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" successfully" Feb 13 16:05:20.819433 containerd[1557]: time="2025-02-13T16:05:20.819420894Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.819484 containerd[1557]: time="2025-02-13T16:05:20.819476089Z" level=info msg="RemovePodSandbox \"00afa653f4b559f3a795254b6440da24629bf1951c3bfd7fb1b5c764875037d7\" returns successfully" Feb 13 16:05:20.819699 containerd[1557]: time="2025-02-13T16:05:20.819686042Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:05:20.819748 containerd[1557]: time="2025-02-13T16:05:20.819733899Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:05:20.819748 containerd[1557]: time="2025-02-13T16:05:20.819744899Z" level=info msg="StopPodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:05:20.819888 containerd[1557]: time="2025-02-13T16:05:20.819874782Z" level=info msg="RemovePodSandbox for \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:05:20.819923 containerd[1557]: time="2025-02-13T16:05:20.819907153Z" level=info msg="Forcibly stopping sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\"" Feb 13 16:05:20.819965 containerd[1557]: time="2025-02-13T16:05:20.819935635Z" level=info msg="TearDown network for sandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" successfully" Feb 13 16:05:20.821048 containerd[1557]: time="2025-02-13T16:05:20.821015924Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.821048 containerd[1557]: time="2025-02-13T16:05:20.821036705Z" level=info msg="RemovePodSandbox \"e2a9df741cc516a7a0e3c6011062aa6b5d210422e3d373eb29ee314c38b73665\" returns successfully" Feb 13 16:05:20.821191 containerd[1557]: time="2025-02-13T16:05:20.821144295Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:05:20.821216 containerd[1557]: time="2025-02-13T16:05:20.821194069Z" level=info msg="TearDown network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" successfully" Feb 13 16:05:20.821216 containerd[1557]: time="2025-02-13T16:05:20.821200853Z" level=info msg="StopPodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" returns successfully" Feb 13 16:05:20.821589 containerd[1557]: time="2025-02-13T16:05:20.821435507Z" level=info msg="RemovePodSandbox for \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:05:20.821589 containerd[1557]: time="2025-02-13T16:05:20.821446643Z" level=info msg="Forcibly stopping sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\"" Feb 13 16:05:20.821589 containerd[1557]: time="2025-02-13T16:05:20.821481549Z" level=info msg="TearDown network for sandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" successfully" Feb 13 16:05:20.822505 containerd[1557]: time="2025-02-13T16:05:20.822489462Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.822532 containerd[1557]: time="2025-02-13T16:05:20.822510461Z" level=info msg="RemovePodSandbox \"56d1515fc26441d42b440c23251a32058ce59b3cc031939895918847677acabb\" returns successfully" Feb 13 16:05:20.822799 containerd[1557]: time="2025-02-13T16:05:20.822696845Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" Feb 13 16:05:20.822799 containerd[1557]: time="2025-02-13T16:05:20.822740995Z" level=info msg="TearDown network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" successfully" Feb 13 16:05:20.822799 containerd[1557]: time="2025-02-13T16:05:20.822747228Z" level=info msg="StopPodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" returns successfully" Feb 13 16:05:20.823036 containerd[1557]: time="2025-02-13T16:05:20.823021891Z" level=info msg="RemovePodSandbox for \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" Feb 13 16:05:20.823867 containerd[1557]: time="2025-02-13T16:05:20.823040695Z" level=info msg="Forcibly stopping sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\"" Feb 13 16:05:20.823867 containerd[1557]: time="2025-02-13T16:05:20.823077218Z" level=info msg="TearDown network for sandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" successfully" Feb 13 16:05:20.824285 containerd[1557]: time="2025-02-13T16:05:20.824271195Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 16:05:20.824341 containerd[1557]: time="2025-02-13T16:05:20.824292026Z" level=info msg="RemovePodSandbox \"a43d1fc591dcc58183f009483faf31bc1bc832651f2ca0038316e649ac63ac33\" returns successfully" Feb 13 16:05:20.824570 containerd[1557]: time="2025-02-13T16:05:20.824498067Z" level=info msg="StopPodSandbox for \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\"" Feb 13 16:05:20.824570 containerd[1557]: time="2025-02-13T16:05:20.824536365Z" level=info msg="TearDown network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" successfully" Feb 13 16:05:20.824570 containerd[1557]: time="2025-02-13T16:05:20.824542616Z" level=info msg="StopPodSandbox for \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" returns successfully" Feb 13 16:05:20.826608 containerd[1557]: time="2025-02-13T16:05:20.824797127Z" level=info msg="RemovePodSandbox for \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\"" Feb 13 16:05:20.826608 containerd[1557]: time="2025-02-13T16:05:20.824807596Z" level=info msg="Forcibly stopping sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\"" Feb 13 16:05:20.826608 containerd[1557]: time="2025-02-13T16:05:20.824890379Z" level=info msg="TearDown network for sandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" successfully" Feb 13 16:05:20.826608 containerd[1557]: time="2025-02-13T16:05:20.825991113Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 16:05:20.826608 containerd[1557]: time="2025-02-13T16:05:20.826008799Z" level=info msg="RemovePodSandbox \"b780ce9f832ac32b8a74750650b126d01ee54f094c8e9c54041d7fa505585945\" returns successfully" Feb 13 16:05:25.890773 systemd[1]: Started sshd@7-139.178.70.109:22-147.75.109.163:54856.service - OpenSSH per-connection server daemon (147.75.109.163:54856). Feb 13 16:05:25.969381 sshd[6012]: Accepted publickey for core from 147.75.109.163 port 54856 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:25.971729 sshd-session[6012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:25.978360 systemd-logind[1538]: New session 10 of user core. Feb 13 16:05:25.987812 systemd[1]: Started session-10.scope - Session 10 of User core. Feb 13 16:05:26.441829 sshd[6014]: Connection closed by 147.75.109.163 port 54856 Feb 13 16:05:26.442384 sshd-session[6012]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:26.445474 systemd[1]: sshd@7-139.178.70.109:22-147.75.109.163:54856.service: Deactivated successfully. Feb 13 16:05:26.446615 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 16:05:26.447316 systemd-logind[1538]: Session 10 logged out. Waiting for processes to exit. Feb 13 16:05:26.448198 systemd-logind[1538]: Removed session 10. Feb 13 16:05:31.453659 systemd[1]: Started sshd@8-139.178.70.109:22-147.75.109.163:49000.service - OpenSSH per-connection server daemon (147.75.109.163:49000). Feb 13 16:05:31.510956 sshd[6070]: Accepted publickey for core from 147.75.109.163 port 49000 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:31.512691 sshd-session[6070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:31.518088 systemd-logind[1538]: New session 11 of user core. Feb 13 16:05:31.521574 systemd[1]: Started session-11.scope - Session 11 of User core. Feb 13 16:05:31.658961 sshd[6089]: Connection closed by 147.75.109.163 port 49000 Feb 13 16:05:31.659490 sshd-session[6070]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:31.663997 systemd-logind[1538]: Session 11 logged out. Waiting for processes to exit. Feb 13 16:05:31.664076 systemd[1]: sshd@8-139.178.70.109:22-147.75.109.163:49000.service: Deactivated successfully. Feb 13 16:05:31.665283 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 16:05:31.665853 systemd-logind[1538]: Removed session 11. Feb 13 16:05:36.668550 systemd[1]: Started sshd@9-139.178.70.109:22-147.75.109.163:49006.service - OpenSSH per-connection server daemon (147.75.109.163:49006). Feb 13 16:05:36.760260 sshd[6114]: Accepted publickey for core from 147.75.109.163 port 49006 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:36.762997 sshd-session[6114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:36.766879 systemd-logind[1538]: New session 12 of user core. Feb 13 16:05:36.773730 systemd[1]: Started session-12.scope - Session 12 of User core. Feb 13 16:05:36.871660 sshd[6116]: Connection closed by 147.75.109.163 port 49006 Feb 13 16:05:36.871986 sshd-session[6114]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:36.879082 systemd[1]: sshd@9-139.178.70.109:22-147.75.109.163:49006.service: Deactivated successfully. Feb 13 16:05:36.879992 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 16:05:36.880797 systemd-logind[1538]: Session 12 logged out. Waiting for processes to exit. 
Feb 13 16:05:36.884788 systemd[1]: Started sshd@10-139.178.70.109:22-147.75.109.163:49020.service - OpenSSH per-connection server daemon (147.75.109.163:49020). Feb 13 16:05:36.885811 systemd-logind[1538]: Removed session 12. Feb 13 16:05:36.914755 sshd[6128]: Accepted publickey for core from 147.75.109.163 port 49020 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:36.915583 sshd-session[6128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:36.918785 systemd-logind[1538]: New session 13 of user core. Feb 13 16:05:36.926744 systemd[1]: Started session-13.scope - Session 13 of User core. Feb 13 16:05:37.174805 sshd[6131]: Connection closed by 147.75.109.163 port 49020 Feb 13 16:05:37.180845 sshd-session[6128]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:37.192574 systemd[1]: Started sshd@11-139.178.70.109:22-147.75.109.163:49028.service - OpenSSH per-connection server daemon (147.75.109.163:49028). Feb 13 16:05:37.192943 systemd[1]: sshd@10-139.178.70.109:22-147.75.109.163:49020.service: Deactivated successfully. Feb 13 16:05:37.194188 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 16:05:37.201169 systemd-logind[1538]: Session 13 logged out. Waiting for processes to exit. Feb 13 16:05:37.203833 systemd-logind[1538]: Removed session 13. Feb 13 16:05:37.253029 sshd[6138]: Accepted publickey for core from 147.75.109.163 port 49028 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:37.254488 sshd-session[6138]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:37.258552 systemd-logind[1538]: New session 14 of user core. Feb 13 16:05:37.263738 systemd[1]: Started session-14.scope - Session 14 of User core. Feb 13 16:05:37.378677 sshd[6143]: Connection closed by 147.75.109.163 port 49028 Feb 13 16:05:37.379026 sshd-session[6138]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:37.381064 systemd[1]: sshd@11-139.178.70.109:22-147.75.109.163:49028.service: Deactivated successfully. Feb 13 16:05:37.382110 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 16:05:37.382528 systemd-logind[1538]: Session 14 logged out. Waiting for processes to exit. Feb 13 16:05:37.383181 systemd-logind[1538]: Removed session 14. Feb 13 16:05:42.389236 systemd[1]: Started sshd@12-139.178.70.109:22-147.75.109.163:41456.service - OpenSSH per-connection server daemon (147.75.109.163:41456). Feb 13 16:05:42.423466 sshd[6154]: Accepted publickey for core from 147.75.109.163 port 41456 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:42.424452 sshd-session[6154]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:42.428677 systemd-logind[1538]: New session 15 of user core. Feb 13 16:05:42.437762 systemd[1]: Started session-15.scope - Session 15 of User core. Feb 13 16:05:42.641754 sshd[6156]: Connection closed by 147.75.109.163 port 41456 Feb 13 16:05:42.642263 sshd-session[6154]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:42.650783 systemd[1]: sshd@12-139.178.70.109:22-147.75.109.163:41456.service: Deactivated successfully. Feb 13 16:05:42.652184 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 16:05:42.653209 systemd-logind[1538]: Session 15 logged out. Waiting for processes to exit. Feb 13 16:05:42.653842 systemd-logind[1538]: Removed session 15. 
Feb 13 16:05:47.644825 systemd[1]: Started sshd@13-139.178.70.109:22-147.75.109.163:41466.service - OpenSSH per-connection server daemon (147.75.109.163:41466). Feb 13 16:05:47.739089 sshd[6172]: Accepted publickey for core from 147.75.109.163 port 41466 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:47.739897 sshd-session[6172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:47.743344 systemd-logind[1538]: New session 16 of user core. Feb 13 16:05:47.745756 systemd[1]: Started session-16.scope - Session 16 of User core. Feb 13 16:05:48.002220 sshd[6174]: Connection closed by 147.75.109.163 port 41466 Feb 13 16:05:48.002908 sshd-session[6172]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:48.004465 systemd-logind[1538]: Session 16 logged out. Waiting for processes to exit. Feb 13 16:05:48.004738 systemd[1]: sshd@13-139.178.70.109:22-147.75.109.163:41466.service: Deactivated successfully. Feb 13 16:05:48.006125 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 16:05:48.007275 systemd-logind[1538]: Removed session 16. Feb 13 16:05:53.013371 systemd[1]: Started sshd@14-139.178.70.109:22-147.75.109.163:60792.service - OpenSSH per-connection server daemon (147.75.109.163:60792). Feb 13 16:05:53.059294 sshd[6190]: Accepted publickey for core from 147.75.109.163 port 60792 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:53.059987 sshd-session[6190]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:53.062501 systemd-logind[1538]: New session 17 of user core. Feb 13 16:05:53.072726 systemd[1]: Started session-17.scope - Session 17 of User core. Feb 13 16:05:53.357060 sshd[6192]: Connection closed by 147.75.109.163 port 60792 Feb 13 16:05:53.360156 sshd-session[6190]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:53.367171 systemd[1]: Started sshd@15-139.178.70.109:22-147.75.109.163:60794.service - OpenSSH per-connection server daemon (147.75.109.163:60794). Feb 13 16:05:53.367425 systemd[1]: sshd@14-139.178.70.109:22-147.75.109.163:60792.service: Deactivated successfully. Feb 13 16:05:53.368352 systemd[1]: session-17.scope: Deactivated successfully. Feb 13 16:05:53.369836 systemd-logind[1538]: Session 17 logged out. Waiting for processes to exit. Feb 13 16:05:53.371831 systemd-logind[1538]: Removed session 17. Feb 13 16:05:53.426691 sshd[6200]: Accepted publickey for core from 147.75.109.163 port 60794 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 16:05:53.427501 sshd-session[6200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 16:05:53.430761 systemd-logind[1538]: New session 18 of user core. Feb 13 16:05:53.440728 systemd[1]: Started session-18.scope - Session 18 of User core. Feb 13 16:05:53.802201 sshd[6205]: Connection closed by 147.75.109.163 port 60794 Feb 13 16:05:53.802559 sshd-session[6200]: pam_unix(sshd:session): session closed for user core Feb 13 16:05:53.810028 systemd[1]: Started sshd@16-139.178.70.109:22-147.75.109.163:60796.service - OpenSSH per-connection server daemon (147.75.109.163:60796). Feb 13 16:05:53.812193 systemd[1]: sshd@15-139.178.70.109:22-147.75.109.163:60794.service: Deactivated successfully. Feb 13 16:05:53.813192 systemd[1]: session-18.scope: Deactivated successfully. Feb 13 16:05:53.814894 systemd-logind[1538]: Session 18 logged out. Waiting for processes to exit. 
Feb 13 16:05:53.816905 systemd-logind[1538]: Removed session 18.
Feb 13 16:05:53.854693 sshd[6212]: Accepted publickey for core from 147.75.109.163 port 60796 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:05:53.855405 sshd-session[6212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:05:53.858531 systemd-logind[1538]: New session 19 of user core.
Feb 13 16:05:53.861719 systemd[1]: Started session-19.scope - Session 19 of User core.
Feb 13 16:05:54.489967 sshd[6217]: Connection closed by 147.75.109.163 port 60796
Feb 13 16:05:54.490369 sshd-session[6212]: pam_unix(sshd:session): session closed for user core
Feb 13 16:05:54.502943 systemd[1]: Started sshd@17-139.178.70.109:22-147.75.109.163:60810.service - OpenSSH per-connection server daemon (147.75.109.163:60810).
Feb 13 16:05:54.503231 systemd[1]: sshd@16-139.178.70.109:22-147.75.109.163:60796.service: Deactivated successfully.
Feb 13 16:05:54.504272 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 16:05:54.508282 systemd-logind[1538]: Session 19 logged out. Waiting for processes to exit.
Feb 13 16:05:54.509309 systemd-logind[1538]: Removed session 19.
Feb 13 16:05:54.561523 sshd[6232]: Accepted publickey for core from 147.75.109.163 port 60810 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:05:54.562764 sshd-session[6232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:05:54.566079 systemd-logind[1538]: New session 20 of user core.
Feb 13 16:05:54.569726 systemd[1]: Started session-20.scope - Session 20 of User core.
Feb 13 16:05:54.790231 sshd[6237]: Connection closed by 147.75.109.163 port 60810
Feb 13 16:05:54.791337 sshd-session[6232]: pam_unix(sshd:session): session closed for user core
Feb 13 16:05:54.804918 systemd[1]: Started sshd@18-139.178.70.109:22-147.75.109.163:60820.service - OpenSSH per-connection server daemon (147.75.109.163:60820).
Feb 13 16:05:54.805244 systemd[1]: sshd@17-139.178.70.109:22-147.75.109.163:60810.service: Deactivated successfully.
Feb 13 16:05:54.806478 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 16:05:54.807903 systemd-logind[1538]: Session 20 logged out. Waiting for processes to exit.
Feb 13 16:05:54.809039 systemd-logind[1538]: Removed session 20.
Feb 13 16:05:54.849627 sshd[6245]: Accepted publickey for core from 147.75.109.163 port 60820 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:05:54.850487 sshd-session[6245]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:05:54.853968 systemd-logind[1538]: New session 21 of user core.
Feb 13 16:05:54.857748 systemd[1]: Started session-21.scope - Session 21 of User core.
Feb 13 16:05:54.949248 sshd[6250]: Connection closed by 147.75.109.163 port 60820
Feb 13 16:05:54.949608 sshd-session[6245]: pam_unix(sshd:session): session closed for user core
Feb 13 16:05:54.951349 systemd-logind[1538]: Session 21 logged out. Waiting for processes to exit.
Feb 13 16:05:54.951528 systemd[1]: sshd@18-139.178.70.109:22-147.75.109.163:60820.service: Deactivated successfully.
Feb 13 16:05:54.952622 systemd[1]: session-21.scope: Deactivated successfully.
Feb 13 16:05:54.953529 systemd-logind[1538]: Removed session 21.
Feb 13 16:05:56.859852 systemd[1]: run-containerd-runc-k8s.io-50e9a5492b452bb6747758b5b0a07d43fca89d8bafd7f417f6b8e17ccf8f3e47-runc.r3pXmg.mount: Deactivated successfully.
Feb 13 16:05:59.960667 systemd[1]: Started sshd@19-139.178.70.109:22-147.75.109.163:56884.service - OpenSSH per-connection server daemon (147.75.109.163:56884).
Feb 13 16:05:59.994624 sshd[6288]: Accepted publickey for core from 147.75.109.163 port 56884 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:05:59.995506 sshd-session[6288]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:05:59.998748 systemd-logind[1538]: New session 22 of user core.
Feb 13 16:06:00.003814 systemd[1]: Started session-22.scope - Session 22 of User core.
Feb 13 16:06:00.111545 sshd[6290]: Connection closed by 147.75.109.163 port 56884
Feb 13 16:06:00.112170 sshd-session[6288]: pam_unix(sshd:session): session closed for user core
Feb 13 16:06:00.114278 systemd-logind[1538]: Session 22 logged out. Waiting for processes to exit.
Feb 13 16:06:00.114353 systemd[1]: sshd@19-139.178.70.109:22-147.75.109.163:56884.service: Deactivated successfully.
Feb 13 16:06:00.115744 systemd[1]: session-22.scope: Deactivated successfully.
Feb 13 16:06:00.117023 systemd-logind[1538]: Removed session 22.
Feb 13 16:06:05.122786 systemd[1]: Started sshd@20-139.178.70.109:22-147.75.109.163:56894.service - OpenSSH per-connection server daemon (147.75.109.163:56894).
Feb 13 16:06:05.187679 sshd[6324]: Accepted publickey for core from 147.75.109.163 port 56894 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:06:05.189267 sshd-session[6324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:06:05.193251 systemd-logind[1538]: New session 23 of user core.
Feb 13 16:06:05.201741 systemd[1]: Started session-23.scope - Session 23 of User core.
Feb 13 16:06:05.349624 sshd[6326]: Connection closed by 147.75.109.163 port 56894
Feb 13 16:06:05.349247 sshd-session[6324]: pam_unix(sshd:session): session closed for user core
Feb 13 16:06:05.350943 systemd-logind[1538]: Session 23 logged out. Waiting for processes to exit.
Feb 13 16:06:05.351035 systemd[1]: sshd@20-139.178.70.109:22-147.75.109.163:56894.service: Deactivated successfully.
Feb 13 16:06:05.352137 systemd[1]: session-23.scope: Deactivated successfully.
Feb 13 16:06:05.353213 systemd-logind[1538]: Removed session 23.
Feb 13 16:06:10.368834 systemd[1]: Started sshd@21-139.178.70.109:22-147.75.109.163:53634.service - OpenSSH per-connection server daemon (147.75.109.163:53634).
Feb 13 16:06:10.400098 sshd[6341]: Accepted publickey for core from 147.75.109.163 port 53634 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 16:06:10.400864 sshd-session[6341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 16:06:10.403437 systemd-logind[1538]: New session 24 of user core.
Feb 13 16:06:10.409719 systemd[1]: Started session-24.scope - Session 24 of User core.
Feb 13 16:06:10.505577 sshd[6343]: Connection closed by 147.75.109.163 port 53634
Feb 13 16:06:10.506077 sshd-session[6341]: pam_unix(sshd:session): session closed for user core
Feb 13 16:06:10.508058 systemd[1]: sshd@21-139.178.70.109:22-147.75.109.163:53634.service: Deactivated successfully.
Feb 13 16:06:10.509081 systemd[1]: session-24.scope: Deactivated successfully.
Feb 13 16:06:10.509505 systemd-logind[1538]: Session 24 logged out. Waiting for processes to exit.
Feb 13 16:06:10.510186 systemd-logind[1538]: Removed session 24.