Feb 13 15:55:02.756934 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.43 p3) 2.43.1) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 14:00:20 -00 2025
Feb 13 15:55:02.756952 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65
Feb 13 15:55:02.756958 kernel: Disabled fast string operations
Feb 13 15:55:02.756963 kernel: BIOS-provided physical RAM map:
Feb 13 15:55:02.756966 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Feb 13 15:55:02.756971 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Feb 13 15:55:02.756977 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Feb 13 15:55:02.756981 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Feb 13 15:55:02.756985 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Feb 13 15:55:02.756990 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Feb 13 15:55:02.756994 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Feb 13 15:55:02.756998 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Feb 13 15:55:02.757003 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Feb 13 15:55:02.757007 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 15:55:02.757013 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Feb 13 15:55:02.757018 kernel: NX (Execute Disable) protection: active
Feb 13 15:55:02.757023 kernel: APIC: Static calls initialized
Feb 13 15:55:02.757028 kernel: SMBIOS 2.7 present.
Feb 13 15:55:02.757033 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Feb 13 15:55:02.757038 kernel: vmware: hypercall mode: 0x00
Feb 13 15:55:02.757043 kernel: Hypervisor detected: VMware
Feb 13 15:55:02.757047 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Feb 13 15:55:02.757053 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Feb 13 15:55:02.757058 kernel: vmware: using clock offset of 3014030388 ns
Feb 13 15:55:02.757063 kernel: tsc: Detected 3408.000 MHz processor
Feb 13 15:55:02.757068 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 15:55:02.757073 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 15:55:02.757078 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Feb 13 15:55:02.757083 kernel: total RAM covered: 3072M
Feb 13 15:55:02.757088 kernel: Found optimal setting for mtrr clean up
Feb 13 15:55:02.757094 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Feb 13 15:55:02.757099 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Feb 13 15:55:02.757105 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 15:55:02.757110 kernel: Using GB pages for direct mapping
Feb 13 15:55:02.757115 kernel: ACPI: Early table checksum verification disabled
Feb 13 15:55:02.757119 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Feb 13 15:55:02.757124 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Feb 13 15:55:02.757129 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Feb 13 15:55:02.757134 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Feb 13 15:55:02.757139 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 15:55:02.757147 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 15:55:02.757152 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Feb 13 15:55:02.757157 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Feb 13 15:55:02.757163 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Feb 13 15:55:02.757168 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Feb 13 15:55:02.757173 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Feb 13 15:55:02.757194 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Feb 13 15:55:02.757200 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Feb 13 15:55:02.757206 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Feb 13 15:55:02.757211 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 15:55:02.757216 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 15:55:02.757221 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Feb 13 15:55:02.757226 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Feb 13 15:55:02.757231 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Feb 13 15:55:02.757236 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Feb 13 15:55:02.757244 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Feb 13 15:55:02.757249 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Feb 13 15:55:02.757254 kernel: system APIC only can use physical flat
Feb 13 15:55:02.757259 kernel: APIC: Switched APIC routing to: physical flat
Feb 13 15:55:02.757264 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 15:55:02.757269 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Feb 13 15:55:02.757275 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Feb 13 15:55:02.757279 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Feb 13 15:55:02.757285 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Feb 13 15:55:02.757289 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Feb 13 15:55:02.757296 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Feb 13 15:55:02.757301 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Feb 13 15:55:02.757306 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Feb 13 15:55:02.757311 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Feb 13 15:55:02.757316 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Feb 13 15:55:02.757321 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Feb 13 15:55:02.757326 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Feb 13 15:55:02.757331 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Feb 13 15:55:02.757336 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Feb 13 15:55:02.757341 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Feb 13 15:55:02.757348 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Feb 13 15:55:02.757353 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Feb 13 15:55:02.757358 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Feb 13 15:55:02.757363 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Feb 13 15:55:02.757367 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Feb 13 15:55:02.757372 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Feb 13 15:55:02.757377 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Feb 13 15:55:02.757382 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Feb 13 15:55:02.757388 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Feb 13 15:55:02.757392 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Feb 13 15:55:02.757399 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Feb 13 15:55:02.757404 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Feb 13 15:55:02.757409 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Feb 13 15:55:02.757414 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Feb 13 15:55:02.757419 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Feb 13 15:55:02.757424 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Feb 13 15:55:02.757429 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Feb 13 15:55:02.757434 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Feb 13 15:55:02.757439 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Feb 13 15:55:02.757444 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Feb 13 15:55:02.757450 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Feb 13 15:55:02.757455 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Feb 13 15:55:02.757460 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Feb 13 15:55:02.757465 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Feb 13 15:55:02.757470 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Feb 13 15:55:02.757475 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Feb 13 15:55:02.757480 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Feb 13 15:55:02.757485 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Feb 13 15:55:02.757491 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Feb 13 15:55:02.757496 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Feb 13 15:55:02.757501 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Feb 13 15:55:02.757507 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Feb 13 15:55:02.757511 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Feb 13 15:55:02.757516 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Feb 13 15:55:02.757521 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Feb 13 15:55:02.757526 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Feb 13 15:55:02.757531 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Feb 13 15:55:02.757536 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Feb 13 15:55:02.757541 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Feb 13 15:55:02.757546 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Feb 13 15:55:02.757552 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Feb 13 15:55:02.757557 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Feb 13 15:55:02.757563 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Feb 13 15:55:02.757572 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Feb 13 15:55:02.757578 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Feb 13 15:55:02.757583 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Feb 13 15:55:02.757588 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Feb 13 15:55:02.757594 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Feb 13 15:55:02.757599 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Feb 13 15:55:02.757605 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Feb 13 15:55:02.757611 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Feb 13 15:55:02.757616 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Feb 13 15:55:02.757621 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Feb 13 15:55:02.757627 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Feb 13 15:55:02.757632 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Feb 13 15:55:02.757637 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Feb 13 15:55:02.757642 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Feb 13 15:55:02.757648 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Feb 13 15:55:02.757653 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Feb 13 15:55:02.757659 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Feb 13 15:55:02.757665 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Feb 13 15:55:02.757670 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Feb 13 15:55:02.757675 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Feb 13 15:55:02.757681 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Feb 13 15:55:02.757686 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Feb 13 15:55:02.757692 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Feb 13 15:55:02.757697 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Feb 13 15:55:02.757703 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Feb 13 15:55:02.757708 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Feb 13 15:55:02.757714 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Feb 13 15:55:02.757719 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Feb 13 15:55:02.757725 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Feb 13 15:55:02.757730 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Feb 13 15:55:02.757735 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Feb 13 15:55:02.757741 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Feb 13 15:55:02.757746 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Feb 13 15:55:02.757752 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Feb 13 15:55:02.757757 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Feb 13 15:55:02.757762 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Feb 13 15:55:02.757768 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Feb 13 15:55:02.757774 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Feb 13 15:55:02.757779 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Feb 13 15:55:02.757784 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Feb 13 15:55:02.757790 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Feb 13 15:55:02.757795 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Feb 13 15:55:02.757800 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Feb 13 15:55:02.757805 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Feb 13 15:55:02.757811 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Feb 13 15:55:02.757816 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Feb 13 15:55:02.757823 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Feb 13 15:55:02.757828 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Feb 13 15:55:02.757834 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Feb 13 15:55:02.757841 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Feb 13 15:55:02.757850 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Feb 13 15:55:02.757865 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Feb 13 15:55:02.757873 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Feb 13 15:55:02.757878 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Feb 13 15:55:02.757884 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Feb 13 15:55:02.757889 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Feb 13 15:55:02.757894 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Feb 13 15:55:02.757901 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Feb 13 15:55:02.757906 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Feb 13 15:55:02.757912 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Feb 13 15:55:02.757917 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Feb 13 15:55:02.757922 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Feb 13 15:55:02.757928 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Feb 13 15:55:02.757933 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Feb 13 15:55:02.757938 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Feb 13 15:55:02.757944 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Feb 13 15:55:02.757949 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Feb 13 15:55:02.757955 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Feb 13 15:55:02.757961 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Feb 13 15:55:02.757966 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 15:55:02.757972 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 15:55:02.757977 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Feb 13 15:55:02.757983 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Feb 13 15:55:02.757988 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Feb 13 15:55:02.757994 kernel: Zone ranges:
Feb 13 15:55:02.758000 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 15:55:02.758007 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Feb 13 15:55:02.758012 kernel: Normal empty
Feb 13 15:55:02.758017 kernel: Movable zone start for each node
Feb 13 15:55:02.758023 kernel: Early memory node ranges
Feb 13 15:55:02.758028 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Feb 13 15:55:02.758037 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Feb 13 15:55:02.758047 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Feb 13 15:55:02.758058 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Feb 13 15:55:02.758068 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 15:55:02.758078 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Feb 13 15:55:02.758090 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Feb 13 15:55:02.758100 kernel: ACPI: PM-Timer IO Port: 0x1008
Feb 13 15:55:02.758116 kernel: system APIC only can use physical flat
Feb 13 15:55:02.758125 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Feb 13 15:55:02.758135 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 15:55:02.758145 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 15:55:02.758155 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 15:55:02.758165 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 15:55:02.758174 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 15:55:02.758570 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 15:55:02.758576 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 15:55:02.758582 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 15:55:02.758587 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 15:55:02.758593 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 15:55:02.758598 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 15:55:02.758604 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 15:55:02.758610 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 15:55:02.758615 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 15:55:02.758621 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 15:55:02.758627 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 15:55:02.758633 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Feb 13 15:55:02.758638 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Feb 13 15:55:02.758644 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Feb 13 15:55:02.758649 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Feb 13 15:55:02.758655 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Feb 13 15:55:02.758660 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Feb 13 15:55:02.758666 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Feb 13 15:55:02.760189 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Feb 13 15:55:02.760203 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Feb 13 15:55:02.760210 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Feb 13 15:55:02.760216 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Feb 13 15:55:02.760221 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Feb 13 15:55:02.760227 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Feb 13 15:55:02.760232 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Feb 13 15:55:02.760238 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Feb 13 15:55:02.760243 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Feb 13 15:55:02.760249 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Feb 13 15:55:02.760254 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Feb 13 15:55:02.760261 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Feb 13 15:55:02.760267 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Feb 13 15:55:02.760272 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Feb 13 15:55:02.760278 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Feb 13 15:55:02.760283 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Feb 13 15:55:02.760288 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Feb 13 15:55:02.760294 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Feb 13 15:55:02.760300 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Feb 13 15:55:02.760305 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Feb 13 15:55:02.760311 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Feb 13 15:55:02.760317 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Feb 13 15:55:02.760323 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Feb 13 15:55:02.760328 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Feb 13 15:55:02.760334 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Feb 13 15:55:02.760339 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Feb 13 15:55:02.760345 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Feb 13 15:55:02.760350 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Feb 13 15:55:02.760356 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Feb 13 15:55:02.760361 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Feb 13 15:55:02.760368 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Feb 13 15:55:02.760374 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Feb 13 15:55:02.760379 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Feb 13 15:55:02.760385 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Feb 13 15:55:02.760390 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Feb 13 15:55:02.760395 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Feb 13 15:55:02.760401 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Feb 13 15:55:02.760407 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Feb 13 15:55:02.760412 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Feb 13 15:55:02.760417 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Feb 13 15:55:02.760424 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Feb 13 15:55:02.760430 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Feb 13 15:55:02.760435 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Feb 13 15:55:02.760440 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Feb 13 15:55:02.760446 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Feb 13 15:55:02.760451 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Feb 13 15:55:02.760457 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Feb 13 15:55:02.760462 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Feb 13 15:55:02.760468 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Feb 13 15:55:02.760474 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Feb 13 15:55:02.760480 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Feb 13 15:55:02.760486 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Feb 13 15:55:02.760491 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Feb 13 15:55:02.760497 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Feb 13 15:55:02.760502 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Feb 13 15:55:02.760508 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Feb 13 15:55:02.760513 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Feb 13 15:55:02.760519 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Feb 13 15:55:02.760524 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Feb 13 15:55:02.760529 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Feb 13 15:55:02.760536 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Feb 13 15:55:02.760541 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Feb 13 15:55:02.760547 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Feb 13 15:55:02.760552 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Feb 13 15:55:02.760558 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Feb 13 15:55:02.760563 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Feb 13 15:55:02.760568 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Feb 13 15:55:02.760574 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Feb 13 15:55:02.760579 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Feb 13 15:55:02.760586 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Feb 13 15:55:02.760591 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Feb 13 15:55:02.760597 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Feb 13 15:55:02.760602 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Feb 13 15:55:02.760608 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Feb 13 15:55:02.760613 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Feb 13 15:55:02.760618 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Feb 13 15:55:02.760624 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Feb 13 15:55:02.760629 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Feb 13 15:55:02.760635 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Feb 13 15:55:02.760641 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Feb 13 15:55:02.760647 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Feb 13 15:55:02.760652 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Feb 13 15:55:02.760658 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Feb 13 15:55:02.760664 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Feb 13 15:55:02.760669 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Feb 13 15:55:02.760675 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Feb 13 15:55:02.760680 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Feb 13 15:55:02.760686 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Feb 13 15:55:02.760692 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Feb 13 15:55:02.760698 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Feb 13 15:55:02.760703 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Feb 13 15:55:02.760708 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Feb 13 15:55:02.760714 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Feb 13 15:55:02.760719 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Feb 13 15:55:02.760725 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Feb 13 15:55:02.760730 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Feb 13 15:55:02.760735 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Feb 13 15:55:02.760741 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Feb 13 15:55:02.760748 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Feb 13 15:55:02.760753 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Feb 13 15:55:02.760759 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Feb 13 15:55:02.760764 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Feb 13 15:55:02.760770 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Feb 13 15:55:02.760775 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Feb 13 15:55:02.760781 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Feb 13 15:55:02.760786 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Feb 13 15:55:02.760792 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 15:55:02.760799 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Feb 13 15:55:02.760805 kernel: TSC deadline timer available
Feb 13 15:55:02.760810 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Feb 13 15:55:02.760815 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Feb 13 15:55:02.760821 kernel: Booting paravirtualized kernel on VMware hypervisor
Feb 13 15:55:02.760827 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 15:55:02.760832 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Feb 13 15:55:02.760841 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Feb 13 15:55:02.760850 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Feb 13 15:55:02.760857 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Feb 13 15:55:02.760864 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Feb 13 15:55:02.760870 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Feb 13 15:55:02.760875 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Feb 13 15:55:02.760881 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Feb 13 15:55:02.760894 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Feb 13 15:55:02.760901 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Feb 13 15:55:02.760907 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Feb 13 15:55:02.760912 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Feb 13 15:55:02.760920 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Feb 13 15:55:02.760925 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Feb 13 15:55:02.760931 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Feb 13 15:55:02.760937 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Feb 13 15:55:02.760943 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Feb 13 15:55:02.760949 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Feb 13 15:55:02.760954 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Feb 13 15:55:02.760961 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65
Feb 13 15:55:02.760968 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 15:55:02.760974 kernel: random: crng init done
Feb 13 15:55:02.760980 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Feb 13 15:55:02.760986 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Feb 13 15:55:02.760992 kernel: printk: log_buf_len min size: 262144 bytes
Feb 13 15:55:02.760998 kernel: printk: log_buf_len: 1048576 bytes
Feb 13 15:55:02.761004 kernel: printk: early log buf free: 239648(91%)
Feb 13 15:55:02.761010 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 15:55:02.761016 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 15:55:02.761023 kernel: Fallback order for Node 0: 0
Feb 13 15:55:02.761029 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Feb 13 15:55:02.761034 kernel: Policy zone: DMA32
Feb 13 15:55:02.761040 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 15:55:02.761046 kernel: Memory: 1934316K/2096628K available (14336K kernel code, 2301K rwdata, 22852K rodata, 43476K init, 1596K bss, 162052K reserved, 0K cma-reserved)
Feb 13 15:55:02.761054 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Feb 13 15:55:02.761061 kernel: ftrace: allocating 37893 entries in 149 pages
Feb 13 15:55:02.761066 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 15:55:02.761073 kernel: Dynamic Preempt: voluntary
Feb 13 15:55:02.761079 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 15:55:02.761086 kernel: rcu: RCU event tracing is enabled.
Feb 13 15:55:02.761092 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Feb 13 15:55:02.761098 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 15:55:02.761104 kernel: Rude variant of Tasks RCU enabled.
Feb 13 15:55:02.761110 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 15:55:02.761117 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 15:55:02.761123 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Feb 13 15:55:02.761129 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Feb 13 15:55:02.761134 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Feb 13 15:55:02.761140 kernel: Console: colour VGA+ 80x25
Feb 13 15:55:02.761146 kernel: printk: console [tty0] enabled
Feb 13 15:55:02.761152 kernel: printk: console [ttyS0] enabled
Feb 13 15:55:02.761158 kernel: ACPI: Core revision 20230628
Feb 13 15:55:02.761164 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Feb 13 15:55:02.761171 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 15:55:02.761189 kernel: x2apic enabled
Feb 13 15:55:02.761195 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 15:55:02.761201 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 15:55:02.761217 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Feb 13 15:55:02.761226 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Feb 13 15:55:02.761234 kernel: Disabled fast string operations
Feb 13 15:55:02.761243 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 15:55:02.761249 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 15:55:02.761258 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 15:55:02.761264 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Feb 13 15:55:02.761270 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Feb 13 15:55:02.761276 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Feb 13 15:55:02.761282 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 15:55:02.761288 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 15:55:02.761294 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 15:55:02.761300 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 15:55:02.761306 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 15:55:02.761313 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 15:55:02.761319 kernel: SRBDS: Unknown: Dependent on hypervisor status
Feb 13 15:55:02.761325 kernel: GDS: Unknown: Dependent on hypervisor status
Feb 13 15:55:02.761331 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 15:55:02.761337 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 15:55:02.761343 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 15:55:02.761349 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 15:55:02.761355 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 13 15:55:02.761360 kernel: Freeing SMP alternatives memory: 32K
Feb 13 15:55:02.761368 kernel: pid_max: default: 131072 minimum: 1024
Feb 13 15:55:02.761374 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Feb 13 15:55:02.761379 kernel: landlock: Up and running.
Feb 13 15:55:02.761385 kernel: SELinux: Initializing.
Feb 13 15:55:02.761391 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:55:02.761397 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Feb 13 15:55:02.761403 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Feb 13 15:55:02.761409 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:55:02.761415 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:55:02.761422 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Feb 13 15:55:02.761428 kernel: Performance Events: Skylake events, core PMU driver.
Feb 13 15:55:02.761434 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Feb 13 15:55:02.761440 kernel: core: CPUID marked event: 'instructions' unavailable
Feb 13 15:55:02.761446 kernel: core: CPUID marked event: 'bus cycles' unavailable
Feb 13 15:55:02.761452 kernel: core: CPUID marked event: 'cache references' unavailable
Feb 13 15:55:02.761457 kernel: core: CPUID marked event: 'cache misses' unavailable
Feb 13 15:55:02.761464 kernel: core: CPUID marked event: 'branch instructions' unavailable
Feb 13 15:55:02.761471 kernel: core: CPUID marked event: 'branch misses' unavailable
Feb 13 15:55:02.761477 kernel: ... version: 1
Feb 13 15:55:02.761483 kernel: ... bit width: 48
Feb 13 15:55:02.761489 kernel: ... generic registers: 4
Feb 13 15:55:02.761495 kernel: ... value mask: 0000ffffffffffff
Feb 13 15:55:02.761500 kernel: ...
max period: 000000007fffffff Feb 13 15:55:02.761506 kernel: ... fixed-purpose events: 0 Feb 13 15:55:02.761512 kernel: ... event mask: 000000000000000f Feb 13 15:55:02.761518 kernel: signal: max sigframe size: 1776 Feb 13 15:55:02.761525 kernel: rcu: Hierarchical SRCU implementation. Feb 13 15:55:02.761531 kernel: rcu: Max phase no-delay instances is 400. Feb 13 15:55:02.761537 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 15:55:02.761543 kernel: smp: Bringing up secondary CPUs ... Feb 13 15:55:02.761549 kernel: smpboot: x86: Booting SMP configuration: Feb 13 15:55:02.761555 kernel: .... node #0, CPUs: #1 Feb 13 15:55:02.761561 kernel: Disabled fast string operations Feb 13 15:55:02.761566 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Feb 13 15:55:02.761572 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 15:55:02.761578 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 15:55:02.761585 kernel: smpboot: Max logical packages: 128 Feb 13 15:55:02.761591 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Feb 13 15:55:02.761597 kernel: devtmpfs: initialized Feb 13 15:55:02.761603 kernel: x86/mm: Memory block size: 128MB Feb 13 15:55:02.761609 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Feb 13 15:55:02.761615 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 15:55:02.761621 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Feb 13 15:55:02.761627 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 15:55:02.761632 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 15:55:02.761639 kernel: audit: initializing netlink subsys (disabled) Feb 13 15:55:02.761645 kernel: audit: type=2000 audit(1739462101.069:1): state=initialized audit_enabled=0 res=1 Feb 13 15:55:02.761651 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 15:55:02.761657 
kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 15:55:02.761663 kernel: cpuidle: using governor menu Feb 13 15:55:02.761669 kernel: Simple Boot Flag at 0x36 set to 0x80 Feb 13 15:55:02.761675 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 15:55:02.761681 kernel: dca service started, version 1.12.1 Feb 13 15:55:02.761687 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Feb 13 15:55:02.761694 kernel: PCI: Using configuration type 1 for base access Feb 13 15:55:02.761700 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Feb 13 15:55:02.761706 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 15:55:02.761711 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 15:55:02.761717 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 15:55:02.761723 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 15:55:02.761729 kernel: ACPI: Added _OSI(Module Device) Feb 13 15:55:02.761735 kernel: ACPI: Added _OSI(Processor Device) Feb 13 15:55:02.761740 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 15:55:02.761748 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 15:55:02.761754 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 15:55:02.761760 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Feb 13 15:55:02.761766 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 15:55:02.761772 kernel: ACPI: Interpreter enabled Feb 13 15:55:02.761777 kernel: ACPI: PM: (supports S0 S1 S5) Feb 13 15:55:02.761784 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 15:55:02.761789 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 15:55:02.761795 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 15:55:02.761802 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Feb 13 15:55:02.761808 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Feb 13 15:55:02.761898 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 15:55:02.761955 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Feb 13 15:55:02.762006 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Feb 13 15:55:02.762015 kernel: PCI host bridge to bus 0000:00 Feb 13 15:55:02.762067 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 15:55:02.762117 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Feb 13 15:55:02.762164 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 15:55:02.764248 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 15:55:02.764304 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Feb 13 15:55:02.764356 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Feb 13 15:55:02.764421 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Feb 13 15:55:02.764487 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Feb 13 15:55:02.764546 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Feb 13 15:55:02.764604 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Feb 13 15:55:02.764657 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Feb 13 15:55:02.764709 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Feb 13 15:55:02.764761 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Feb 13 15:55:02.764813 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Feb 13 15:55:02.764867 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Feb 13 15:55:02.764923 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Feb 13 15:55:02.764976 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Feb 13 15:55:02.765028 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Feb 13 15:55:02.765084 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Feb 13 15:55:02.765136 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Feb 13 15:55:02.765606 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Feb 13 15:55:02.765671 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Feb 13 15:55:02.765726 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Feb 13 15:55:02.765779 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Feb 13 15:55:02.765830 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Feb 13 15:55:02.765882 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Feb 13 15:55:02.765934 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 15:55:02.765999 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Feb 13 15:55:02.766059 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.766113 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.766171 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770285 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.770377 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770447 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.770513 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770576 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.770640 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770699 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.770762 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770820 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Feb 13 15:55:02.770881 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.770936 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771020 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771101 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771233 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771290 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771351 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771405 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771469 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771555 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771650 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771722 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771798 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.771863 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.771928 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.772030 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.772101 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.772159 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.774459 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.774520 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.774580 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.774636 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.774693 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.774750 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775002 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775141 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775332 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775392 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775453 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775508 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775580 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775650 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775712 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775765 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775820 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775874 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.775934 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.775986 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776045 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776099 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776155 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776217 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776277 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776330 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776387 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776440 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776496 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Feb 13 
15:55:02.776550 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776607 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776664 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776720 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Feb 13 15:55:02.776774 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Feb 13 15:55:02.776830 kernel: pci_bus 0000:01: extended config space not accessible Feb 13 15:55:02.776886 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 15:55:02.776943 kernel: pci_bus 0000:02: extended config space not accessible Feb 13 15:55:02.776955 kernel: acpiphp: Slot [32] registered Feb 13 15:55:02.776961 kernel: acpiphp: Slot [33] registered Feb 13 15:55:02.776967 kernel: acpiphp: Slot [34] registered Feb 13 15:55:02.776973 kernel: acpiphp: Slot [35] registered Feb 13 15:55:02.776979 kernel: acpiphp: Slot [36] registered Feb 13 15:55:02.776984 kernel: acpiphp: Slot [37] registered Feb 13 15:55:02.776990 kernel: acpiphp: Slot [38] registered Feb 13 15:55:02.776996 kernel: acpiphp: Slot [39] registered Feb 13 15:55:02.777002 kernel: acpiphp: Slot [40] registered Feb 13 15:55:02.777010 kernel: acpiphp: Slot [41] registered Feb 13 15:55:02.777015 kernel: acpiphp: Slot [42] registered Feb 13 15:55:02.777021 kernel: acpiphp: Slot [43] registered Feb 13 15:55:02.777027 kernel: acpiphp: Slot [44] registered Feb 13 15:55:02.777033 kernel: acpiphp: Slot [45] registered Feb 13 15:55:02.777039 kernel: acpiphp: Slot [46] registered Feb 13 15:55:02.777045 kernel: acpiphp: Slot [47] registered Feb 13 15:55:02.777051 kernel: acpiphp: Slot [48] registered Feb 13 15:55:02.777056 kernel: acpiphp: Slot [49] registered Feb 13 15:55:02.777064 kernel: acpiphp: Slot [50] registered Feb 13 15:55:02.777070 kernel: acpiphp: Slot [51] registered Feb 13 15:55:02.777076 kernel: acpiphp: Slot [52] registered Feb 13 15:55:02.777082 kernel: acpiphp: Slot [53] registered 
Feb 13 15:55:02.777088 kernel: acpiphp: Slot [54] registered Feb 13 15:55:02.777093 kernel: acpiphp: Slot [55] registered Feb 13 15:55:02.777099 kernel: acpiphp: Slot [56] registered Feb 13 15:55:02.777105 kernel: acpiphp: Slot [57] registered Feb 13 15:55:02.777111 kernel: acpiphp: Slot [58] registered Feb 13 15:55:02.777117 kernel: acpiphp: Slot [59] registered Feb 13 15:55:02.777124 kernel: acpiphp: Slot [60] registered Feb 13 15:55:02.777130 kernel: acpiphp: Slot [61] registered Feb 13 15:55:02.777135 kernel: acpiphp: Slot [62] registered Feb 13 15:55:02.777141 kernel: acpiphp: Slot [63] registered Feb 13 15:55:02.777206 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Feb 13 15:55:02.777261 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 15:55:02.777314 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 15:55:02.777364 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:55:02.777420 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Feb 13 15:55:02.777472 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Feb 13 15:55:02.777523 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Feb 13 15:55:02.777574 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Feb 13 15:55:02.777626 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Feb 13 15:55:02.777685 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Feb 13 15:55:02.777739 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Feb 13 15:55:02.777795 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Feb 13 15:55:02.777848 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 15:55:02.777901 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 
15:55:02.777954 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Feb 13 15:55:02.778009 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 15:55:02.778061 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 15:55:02.778113 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 15:55:02.778165 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 15:55:02.778231 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 15:55:02.778285 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 15:55:02.778337 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 15:55:02.778391 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 15:55:02.778443 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 15:55:02.778496 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 15:55:02.778547 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:55:02.778604 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 15:55:02.778656 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 15:55:02.778709 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:55:02.778763 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 15:55:02.778815 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 15:55:02.778868 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:55:02.778925 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 15:55:02.778977 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 15:55:02.779029 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:55:02.779084 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 15:55:02.779136 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Feb 13 15:55:02.779309 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:55:02.779368 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 15:55:02.779425 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 15:55:02.779478 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:55:02.779536 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Feb 13 15:55:02.779616 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Feb 13 15:55:02.779688 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Feb 13 15:55:02.779757 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Feb 13 15:55:02.779812 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Feb 13 15:55:02.779869 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 15:55:02.779924 kernel: pci 0000:0b:00.0: supports D1 D2 Feb 13 15:55:02.779977 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 15:55:02.780030 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 15:55:02.780083 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 15:55:02.780136 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 15:55:02.780201 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 15:55:02.780259 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 15:55:02.780315 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 15:55:02.780367 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 15:55:02.780418 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:55:02.780473 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 15:55:02.780525 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 15:55:02.780577 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 15:55:02.780629 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:55:02.780685 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 15:55:02.780738 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 15:55:02.780789 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:55:02.780842 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 15:55:02.780894 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 15:55:02.780945 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:55:02.780999 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 15:55:02.781067 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 15:55:02.781122 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:55:02.781176 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 15:55:02.781243 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 15:55:02.781296 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:55:02.781350 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 15:55:02.781402 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 15:55:02.781454 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:55:02.781508 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 15:55:02.781564 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 15:55:02.781616 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 15:55:02.781668 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:55:02.781724 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 15:55:02.781777 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 15:55:02.781829 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 15:55:02.781881 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:55:02.781935 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 15:55:02.781990 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 15:55:02.782042 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 15:55:02.782094 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:55:02.782147 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 15:55:02.782207 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 15:55:02.782261 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:55:02.782315 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 15:55:02.782368 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 15:55:02.782423 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:55:02.782477 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 15:55:02.782529 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 15:55:02.782581 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 15:55:02.782634 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 15:55:02.782688 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 15:55:02.782740 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:55:02.782794 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 15:55:02.782850 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 15:55:02.782902 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:55:02.782955 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 15:55:02.783006 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 15:55:02.783059 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 15:55:02.783110 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:55:02.783164 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 15:55:02.783234 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 15:55:02.783287 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 15:55:02.783339 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:55:02.783392 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 15:55:02.783443 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 15:55:02.783495 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:55:02.783549 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 15:55:02.783602 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 15:55:02.783657 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:55:02.783711 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 
15:55:02.783764 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 15:55:02.783815 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 15:55:02.783869 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 15:55:02.783921 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 15:55:02.783974 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:55:02.784028 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 15:55:02.784082 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 15:55:02.784134 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:55:02.784201 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 15:55:02.784256 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 15:55:02.784308 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:55:02.784317 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Feb 13 15:55:02.784323 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Feb 13 15:55:02.784329 kernel: ACPI: PCI: Interrupt link LNKB disabled Feb 13 15:55:02.784335 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 15:55:02.784344 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Feb 13 15:55:02.784350 kernel: iommu: Default domain type: Translated Feb 13 15:55:02.784356 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 15:55:02.784362 kernel: PCI: Using ACPI for IRQ routing Feb 13 15:55:02.784368 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 15:55:02.784374 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Feb 13 15:55:02.784380 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Feb 13 15:55:02.784432 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Feb 13 15:55:02.784484 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Feb 13 15:55:02.784537 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 15:55:02.784547 kernel: vgaarb: loaded Feb 13 15:55:02.784553 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Feb 13 15:55:02.784559 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Feb 13 15:55:02.784565 kernel: clocksource: Switched to clocksource tsc-early Feb 13 15:55:02.784571 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 15:55:02.784578 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 15:55:02.784583 kernel: pnp: PnP ACPI init Feb 13 15:55:02.784636 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Feb 13 15:55:02.784688 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Feb 13 15:55:02.784735 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Feb 13 15:55:02.784786 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Feb 13 15:55:02.784837 kernel: pnp 00:06: [dma 2] Feb 13 15:55:02.784892 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Feb 13 15:55:02.784941 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Feb 13 15:55:02.784991 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Feb 13 15:55:02.785000 kernel: pnp: PnP ACPI: found 8 devices Feb 13 15:55:02.785006 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 15:55:02.785012 kernel: NET: Registered PF_INET protocol family Feb 13 15:55:02.785018 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 15:55:02.785024 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 15:55:02.785030 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 15:55:02.785036 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 15:55:02.785044 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 15:55:02.785050 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 15:55:02.785056 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 15:55:02.785062 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 15:55:02.785068 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 15:55:02.785073 kernel: NET: Registered PF_XDP protocol family Feb 13 15:55:02.785127 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Feb 13 15:55:02.785278 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 15:55:02.785342 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 15:55:02.785396 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 15:55:02.785451 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 15:55:02.785504 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Feb 13 15:55:02.785558 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Feb 13 15:55:02.785610 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Feb 13 15:55:02.785667 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Feb 13 15:55:02.785719 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Feb 13 15:55:02.785771 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Feb 13 15:55:02.785824 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Feb 13 15:55:02.785877 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Feb 13 
15:55:02.785928 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Feb 13 15:55:02.785985 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Feb 13 15:55:02.786039 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Feb 13 15:55:02.786092 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Feb 13 15:55:02.786145 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Feb 13 15:55:02.786291 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Feb 13 15:55:02.786349 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Feb 13 15:55:02.786402 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Feb 13 15:55:02.786455 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Feb 13 15:55:02.786507 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Feb 13 15:55:02.786559 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:55:02.786611 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:55:02.786662 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.786717 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.786769 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.786821 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.786874 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.786926 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.786978 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787031 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 
13 15:55:02.787083 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787138 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787203 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787264 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787317 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787370 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787423 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787475 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787528 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787583 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787636 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787689 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787741 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787794 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787846 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.787899 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.787951 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788006 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788059 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788112 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788164 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788254 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788308 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788360 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788413 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788468 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788519 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788572 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788623 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788675 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788726 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788779 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788831 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788885 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.788938 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.788991 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789042 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789094 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789146 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789211 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789266 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789319 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789376 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789429 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789481 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Feb 13 15:55:02.789533 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789586 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789638 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789690 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789742 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789795 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789847 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.789902 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.789955 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790007 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790060 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790111 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790163 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790230 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790285 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790337 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790394 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790446 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790498 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790550 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790603 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790656 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790708 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790760 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790812 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790863 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.790918 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.790971 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.791022 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.791075 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 15:55:02.791127 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 15:55:02.791388 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 15:55:02.791447 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Feb 13 15:55:02.791500 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 15:55:02.791552 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 15:55:02.791606 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:55:02.791663 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Feb 13 15:55:02.791715 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 15:55:02.791766 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 15:55:02.791818 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 15:55:02.791870 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:55:02.791922 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 15:55:02.791975 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 15:55:02.792029 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 15:55:02.792081 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 
15:55:02.792133 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 15:55:02.792193 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 15:55:02.792247 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 15:55:02.792301 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:55:02.792353 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 15:55:02.792405 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 15:55:02.792683 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:55:02.794576 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 15:55:02.794647 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 15:55:02.794705 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:55:02.794763 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 15:55:02.794817 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 15:55:02.794870 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:55:02.794924 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 15:55:02.794979 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 15:55:02.795031 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:55:02.795084 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 15:55:02.795137 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 15:55:02.795257 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:55:02.795317 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Feb 13 15:55:02.795371 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 15:55:02.795423 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 15:55:02.795475 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Feb 13 15:55:02.795530 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:55:02.795584 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 15:55:02.795636 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 15:55:02.795688 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 15:55:02.795750 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:55:02.795805 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 15:55:02.795857 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 15:55:02.795908 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 15:55:02.795960 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:55:02.796015 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 15:55:02.796069 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 15:55:02.796121 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:55:02.796174 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 15:55:02.796241 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 15:55:02.796294 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:55:02.796346 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 15:55:02.796399 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 15:55:02.796452 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:55:02.796504 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 15:55:02.796560 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 15:55:02.796612 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:55:02.796665 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 15:55:02.796718 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 15:55:02.796771 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:55:02.796824 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 15:55:02.796877 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 15:55:02.796930 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 15:55:02.796983 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:55:02.797040 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 15:55:02.797092 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 15:55:02.797143 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 15:55:02.797434 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:55:02.797495 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 15:55:02.797549 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 15:55:02.797603 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 15:55:02.797655 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:55:02.797709 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 15:55:02.797761 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 15:55:02.797816 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:55:02.797869 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 15:55:02.797921 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 15:55:02.797974 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:55:02.798026 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 15:55:02.798079 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 15:55:02.798132 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 
15:55:02.798254 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 15:55:02.798311 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 15:55:02.798368 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:55:02.798421 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 15:55:02.798472 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 15:55:02.798523 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:55:02.798575 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 15:55:02.798627 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 15:55:02.798678 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 15:55:02.798729 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:55:02.798782 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 15:55:02.798837 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 15:55:02.798889 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 15:55:02.798941 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:55:02.798993 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 15:55:02.799045 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 15:55:02.799097 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:55:02.799150 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 15:55:02.799210 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 15:55:02.799263 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:55:02.799316 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 15:55:02.799374 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 15:55:02.799427 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Feb 13 15:55:02.799480 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 15:55:02.799533 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 15:55:02.799586 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:55:02.799638 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 15:55:02.799690 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 15:55:02.799742 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:55:02.799794 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 15:55:02.799850 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 15:55:02.799902 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:55:02.799953 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 15:55:02.800000 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 15:55:02.800047 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 15:55:02.800092 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Feb 13 15:55:02.800138 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Feb 13 15:55:02.800347 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Feb 13 15:55:02.800403 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Feb 13 15:55:02.800451 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 15:55:02.800499 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 15:55:02.800546 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 15:55:02.800594 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 15:55:02.800643 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Feb 13 15:55:02.800704 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Feb 13 15:55:02.800763 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Feb 13 15:55:02.800816 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Feb 13 15:55:02.800864 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 15:55:02.800916 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Feb 13 15:55:02.800965 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Feb 13 15:55:02.801012 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 15:55:02.801063 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Feb 13 15:55:02.801114 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Feb 13 15:55:02.801162 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 15:55:02.801233 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Feb 13 15:55:02.801284 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 15:55:02.801336 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Feb 13 15:55:02.801385 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 15:55:02.801437 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Feb 13 15:55:02.801696 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 15:55:02.801751 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Feb 13 15:55:02.803795 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 15:55:02.803859 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Feb 13 15:55:02.803918 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 15:55:02.803974 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Feb 13 15:55:02.804023 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Feb 13 15:55:02.804073 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 15:55:02.804124 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Feb 13 15:55:02.804171 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Feb 13 15:55:02.804275 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 15:55:02.804330 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Feb 13 15:55:02.804382 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Feb 13 15:55:02.804430 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 15:55:02.804482 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Feb 13 15:55:02.804531 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 15:55:02.804584 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Feb 13 15:55:02.804633 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 15:55:02.804685 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Feb 13 15:55:02.804737 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 15:55:02.804789 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Feb 13 15:55:02.804838 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 15:55:02.804893 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Feb 13 15:55:02.804943 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 15:55:02.804994 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Feb 13 15:55:02.805046 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Feb 13 15:55:02.805094 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 15:55:02.805147 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Feb 13 15:55:02.805214 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Feb 13 15:55:02.805279 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 15:55:02.805330 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Feb 13 15:55:02.805382 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Feb 13 15:55:02.805429 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 15:55:02.805481 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Feb 13 15:55:02.805529 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 15:55:02.805580 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Feb 13 15:55:02.805628 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 15:55:02.805678 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Feb 13 15:55:02.805728 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 15:55:02.805779 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Feb 13 15:55:02.805827 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 15:55:02.805878 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Feb 13 15:55:02.805926 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 15:55:02.805982 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Feb 13 15:55:02.806031 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Feb 13 15:55:02.806078 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 15:55:02.806129 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Feb 13 15:55:02.806213 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Feb 13 15:55:02.806270 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 15:55:02.806322 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Feb 13 15:55:02.806373 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 15:55:02.806424 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Feb 13 15:55:02.806472 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Feb 13 15:55:02.806522 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Feb 13 15:55:02.806570 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 15:55:02.806621 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Feb 13 15:55:02.806671 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 15:55:02.806722 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Feb 13 15:55:02.806770 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 15:55:02.806820 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Feb 13 15:55:02.806868 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 15:55:02.806924 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 15:55:02.806934 kernel: PCI: CLS 32 bytes, default 64 Feb 13 15:55:02.806943 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 15:55:02.806949 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 15:55:02.806956 kernel: clocksource: Switched to clocksource tsc Feb 13 15:55:02.806962 kernel: Initialise system trusted keyrings Feb 13 15:55:02.806969 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 15:55:02.806976 kernel: Key type asymmetric registered Feb 13 15:55:02.806982 kernel: Asymmetric key parser 'x509' registered Feb 13 15:55:02.806989 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 15:55:02.806995 kernel: io scheduler mq-deadline registered Feb 13 15:55:02.807003 kernel: io scheduler kyber registered Feb 13 15:55:02.807009 kernel: io scheduler bfq registered Feb 13 15:55:02.807062 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Feb 13 15:55:02.807117 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807171 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Feb 13 15:55:02.807287 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807341 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Feb 13 15:55:02.807395 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807451 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Feb 13 15:55:02.807504 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807557 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Feb 13 15:55:02.807609 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807662 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Feb 13 15:55:02.807718 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807772 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Feb 13 15:55:02.807842 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.807910 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Feb 13 15:55:02.807962 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808017 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Feb 13 15:55:02.808070 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808122 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Feb 13 15:55:02.808174 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808242 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Feb 13 15:55:02.808294 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808346 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Feb 13 15:55:02.808401 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808454 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Feb 13 15:55:02.808506 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808558 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Feb 13 15:55:02.808610 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808663 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Feb 13 15:55:02.808718 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808770 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Feb 13 15:55:02.808823 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808876 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Feb 13 15:55:02.808928 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.808981 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Feb 13 15:55:02.809037 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809090 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Feb 13 15:55:02.809142 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809273 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Feb 13 15:55:02.809326 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809378 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Feb 13 15:55:02.809433 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809485 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Feb 13 15:55:02.809537 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809588 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Feb 13 15:55:02.809641 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809695 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Feb 13 15:55:02.809747 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809800 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Feb 13 15:55:02.809852 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.809903 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Feb 13 15:55:02.809954 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810006 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Feb 13 15:55:02.810061 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810113 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Feb 13 15:55:02.810165 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810256 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Feb 13 15:55:02.810309 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810366 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Feb 13 15:55:02.810418 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810469 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Feb 13 15:55:02.810521 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810573 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Feb 13 15:55:02.810627 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 15:55:02.810636 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Feb 13 15:55:02.810643 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 15:55:02.810649 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 15:55:02.810656 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 15:55:02.810662 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 15:55:02.810668 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 15:55:02.810722 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 15:55:02.810772 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T15:55:02 UTC (1739462102) Feb 13 15:55:02.810819 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 15:55:02.810828 kernel: intel_pstate: CPU model not supported Feb 13 15:55:02.810835 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 Feb 13 15:55:02.810841 kernel: NET: Registered PF_INET6 protocol family Feb 13 15:55:02.810847 kernel: Segment Routing with IPv6 Feb 13 15:55:02.810853 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 15:55:02.810859 kernel: NET: Registered PF_PACKET protocol family Feb 13 15:55:02.810867 kernel: Key type dns_resolver registered Feb 13 15:55:02.810873 kernel: IPI shorthand broadcast: enabled Feb 13 15:55:02.810880 kernel: sched_clock: Marking stable (904003389, 244036649)->(1208800893, -60760855) Feb 13 15:55:02.810886 kernel: registered taskstats version 1 Feb 13 15:55:02.810892 kernel: Loading compiled-in X.509 certificates Feb 13 15:55:02.810898 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: a260c8876205efb4ca2ab3eb040cd310ec7afd21' Feb 13 15:55:02.810904 kernel: Key type .fscrypt registered Feb 13 15:55:02.810910 kernel: Key type fscrypt-provisioning registered Feb 13 15:55:02.810916 kernel: ima: No TPM chip found, activating TPM-bypass! 
Feb 13 15:55:02.810923 kernel: ima: Allocated hash algorithm: sha1 Feb 13 15:55:02.810930 kernel: ima: No architecture policies found Feb 13 15:55:02.810936 kernel: clk: Disabling unused clocks Feb 13 15:55:02.810942 kernel: Freeing unused kernel image (initmem) memory: 43476K Feb 13 15:55:02.810948 kernel: Write protecting the kernel read-only data: 38912k Feb 13 15:55:02.810954 kernel: Freeing unused kernel image (rodata/data gap) memory: 1724K Feb 13 15:55:02.810960 kernel: Run /init as init process Feb 13 15:55:02.810966 kernel: with arguments: Feb 13 15:55:02.810973 kernel: /init Feb 13 15:55:02.810980 kernel: with environment: Feb 13 15:55:02.810986 kernel: HOME=/ Feb 13 15:55:02.810993 kernel: TERM=linux Feb 13 15:55:02.810998 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 15:55:02.811006 systemd[1]: Successfully made /usr/ read-only. Feb 13 15:55:02.811014 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 15:55:02.811021 systemd[1]: Detected virtualization vmware. Feb 13 15:55:02.811027 systemd[1]: Detected architecture x86-64. Feb 13 15:55:02.811035 systemd[1]: Running in initrd. Feb 13 15:55:02.811041 systemd[1]: No hostname configured, using default hostname. Feb 13 15:55:02.811048 systemd[1]: Hostname set to . Feb 13 15:55:02.811054 systemd[1]: Initializing machine ID from random generator. Feb 13 15:55:02.811060 systemd[1]: Queued start job for default target initrd.target. Feb 13 15:55:02.811066 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 15:55:02.811073 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Feb 13 15:55:02.811080 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 15:55:02.811088 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 15:55:02.811094 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 15:55:02.811101 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 15:55:02.811108 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 15:55:02.811114 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 15:55:02.811121 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 15:55:02.811127 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 15:55:02.811135 systemd[1]: Reached target paths.target - Path Units. Feb 13 15:55:02.811141 systemd[1]: Reached target slices.target - Slice Units. Feb 13 15:55:02.811148 systemd[1]: Reached target swap.target - Swaps. Feb 13 15:55:02.811154 systemd[1]: Reached target timers.target - Timer Units. Feb 13 15:55:02.811161 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 15:55:02.811167 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 15:55:02.811173 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 15:55:02.811246 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Feb 13 15:55:02.811254 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 15:55:02.811261 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 15:55:02.811267 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Feb 13 15:55:02.811273 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 15:55:02.811279 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 15:55:02.811285 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 15:55:02.811292 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 15:55:02.811298 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 15:55:02.811304 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 15:55:02.811311 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 15:55:02.811318 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:55:02.811324 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 15:55:02.811331 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 15:55:02.811352 systemd-journald[217]: Collecting audit messages is disabled. Feb 13 15:55:02.811369 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 15:55:02.811376 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 15:55:02.811382 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 15:55:02.811390 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 15:55:02.811397 kernel: Bridge firewalling registered Feb 13 15:55:02.811403 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 15:55:02.811426 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Feb 13 15:55:02.811432 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Feb 13 15:55:02.811439 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Feb 13 15:55:02.811445 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:55:02.811451 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Feb 13 15:55:02.811458 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 15:55:02.811467 systemd-journald[217]: Journal started Feb 13 15:55:02.811495 systemd-journald[217]: Runtime Journal (/run/log/journal/048321167bcb4b2ab9efb9fa325a7491) is 4.8M, max 38.6M, 33.8M free. Feb 13 15:55:02.754621 systemd-modules-load[218]: Inserted module 'overlay' Feb 13 15:55:02.779723 systemd-modules-load[218]: Inserted module 'br_netfilter' Feb 13 15:55:02.814245 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 15:55:02.819264 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Feb 13 15:55:02.819505 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 15:55:02.820384 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Feb 13 15:55:02.823633 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 15:55:02.824718 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Feb 13 15:55:02.829165 dracut-cmdline[250]: dracut-dracut-053 Feb 13 15:55:02.830474 dracut-cmdline[250]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=f6a3351ed39d61c0cb6d1964ad84b777665fb0b2f253a15f9696d9c5fba26f65 Feb 13 15:55:02.847416 systemd-resolved[254]: Positive Trust Anchors: Feb 13 15:55:02.847423 systemd-resolved[254]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Feb 13 15:55:02.847445 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Feb 13 15:55:02.848987 systemd-resolved[254]: Defaulting to hostname 'linux'. Feb 13 15:55:02.849650 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Feb 13 15:55:02.849948 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Feb 13 15:55:02.878636 kernel: SCSI subsystem initialized Feb 13 15:55:02.882190 kernel: Loading iSCSI transport class v2.0-870. Feb 13 15:55:02.888197 kernel: iscsi: registered transport (tcp) Feb 13 15:55:02.901211 kernel: iscsi: registered transport (qla4xxx) Feb 13 15:55:02.901228 kernel: QLogic iSCSI HBA Driver Feb 13 15:55:02.920740 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Feb 13 15:55:02.925296 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Feb 13 15:55:02.940012 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
Feb 13 15:55:02.940032 kernel: device-mapper: uevent: version 1.0.3 Feb 13 15:55:02.940040 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Feb 13 15:55:02.970220 kernel: raid6: avx2x4 gen() 48001 MB/s Feb 13 15:55:02.987188 kernel: raid6: avx2x2 gen() 54521 MB/s Feb 13 15:55:03.004381 kernel: raid6: avx2x1 gen() 45864 MB/s Feb 13 15:55:03.004398 kernel: raid6: using algorithm avx2x2 gen() 54521 MB/s Feb 13 15:55:03.022366 kernel: raid6: .... xor() 33224 MB/s, rmw enabled Feb 13 15:55:03.022394 kernel: raid6: using avx2x2 recovery algorithm Feb 13 15:55:03.035191 kernel: xor: automatically using best checksumming function avx Feb 13 15:55:03.122198 kernel: Btrfs loaded, zoned=no, fsverity=no Feb 13 15:55:03.127282 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Feb 13 15:55:03.133450 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 15:55:03.141366 systemd-udevd[436]: Using default interface naming scheme 'v255'. Feb 13 15:55:03.144225 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 15:55:03.147274 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Feb 13 15:55:03.155372 dracut-pre-trigger[437]: rd.md=0: removing MD RAID activation Feb 13 15:55:03.169597 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 15:55:03.174246 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Feb 13 15:55:03.242566 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 15:55:03.250304 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Feb 13 15:55:03.260332 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Feb 13 15:55:03.260732 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
Feb 13 15:55:03.260985 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 15:55:03.261084 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 15:55:03.263254 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Feb 13 15:55:03.272492 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Feb 13 15:55:03.315228 kernel: VMware PVSCSI driver - version 1.0.7.0-k Feb 13 15:55:03.324806 kernel: vmw_pvscsi: using 64bit dma Feb 13 15:55:03.324829 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Feb 13 15:55:03.324841 kernel: vmw_pvscsi: max_id: 16 Feb 13 15:55:03.324848 kernel: vmw_pvscsi: setting ring_pages to 8 Feb 13 15:55:03.328197 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Feb 13 15:55:03.339204 kernel: vmw_pvscsi: enabling reqCallThreshold Feb 13 15:55:03.339215 kernel: vmw_pvscsi: driver-based request coalescing enabled Feb 13 15:55:03.339223 kernel: vmw_pvscsi: using MSI-X Feb 13 15:55:03.339230 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Feb 13 15:55:03.339248 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Feb 13 15:55:03.339327 kernel: libata version 3.00 loaded. 
Feb 13 15:55:03.343192 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Feb 13 15:55:03.346846 kernel: ata_piix 0000:00:07.1: version 2.13 Feb 13 15:55:03.355138 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Feb 13 15:55:03.355242 kernel: scsi host1: ata_piix Feb 13 15:55:03.355306 kernel: scsi host2: ata_piix Feb 13 15:55:03.355365 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Feb 13 15:55:03.355376 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Feb 13 15:55:03.355384 kernel: cryptd: max_cpu_qlen set to 1000 Feb 13 15:55:03.355392 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Feb 13 15:55:03.357543 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 15:55:03.357698 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 15:55:03.358038 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:55:03.358325 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 15:55:03.358351 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:55:03.358730 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:55:03.363286 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 15:55:03.373217 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:55:03.386406 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 15:55:03.396819 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Feb 13 15:55:03.518265 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Feb 13 15:55:03.525223 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Feb 13 15:55:03.540224 kernel: AVX2 version of gcm_enc/dec engaged. Feb 13 15:55:03.542562 kernel: AES CTR mode by8 optimization enabled Feb 13 15:55:03.546479 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Feb 13 15:55:03.582719 kernel: sd 0:0:0:0: [sda] Write Protect is off Feb 13 15:55:03.582972 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Feb 13 15:55:03.583045 kernel: sd 0:0:0:0: [sda] Cache data unavailable Feb 13 15:55:03.583115 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Feb 13 15:55:03.583195 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Feb 13 15:55:03.583276 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Feb 13 15:55:03.583286 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Feb 13 15:55:03.583353 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:55:03.583362 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Feb 13 15:55:03.606528 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (486) Feb 13 15:55:03.615236 kernel: BTRFS: device fsid 506754f7-5ef1-4c63-ad2a-b7b855a48f85 devid 1 transid 40 /dev/sda3 scanned by (udev-worker) (499) Feb 13 15:55:03.618606 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Feb 13 15:55:03.624140 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Feb 13 15:55:03.632164 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Feb 13 15:55:03.632338 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Feb 13 15:55:03.637619 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. 
Feb 13 15:55:03.648292 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Feb 13 15:55:03.674246 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:55:04.684201 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Feb 13 15:55:04.685225 disk-uuid[595]: The operation has completed successfully. Feb 13 15:55:04.726832 systemd[1]: disk-uuid.service: Deactivated successfully. Feb 13 15:55:04.727105 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Feb 13 15:55:04.745440 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Feb 13 15:55:04.747494 sh[611]: Success Feb 13 15:55:04.757246 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Feb 13 15:55:04.797224 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Feb 13 15:55:04.802402 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Feb 13 15:55:04.803572 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Feb 13 15:55:04.835530 kernel: BTRFS info (device dm-0): first mount of filesystem 506754f7-5ef1-4c63-ad2a-b7b855a48f85 Feb 13 15:55:04.835565 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:55:04.835575 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Feb 13 15:55:04.836608 kernel: BTRFS info (device dm-0): disabling log replay at mount time Feb 13 15:55:04.838187 kernel: BTRFS info (device dm-0): using free space tree Feb 13 15:55:04.844188 kernel: BTRFS info (device dm-0): enabling ssd optimizations Feb 13 15:55:04.846033 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Feb 13 15:55:04.853390 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Feb 13 15:55:04.855257 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Feb 13 15:55:04.876128 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:55:04.876157 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:55:04.876170 kernel: BTRFS info (device sda6): using free space tree Feb 13 15:55:04.880189 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 15:55:04.895133 systemd[1]: mnt-oem.mount: Deactivated successfully. Feb 13 15:55:04.897213 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:55:04.901279 systemd[1]: Finished ignition-setup.service - Ignition (setup). Feb 13 15:55:04.906279 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Feb 13 15:55:04.916659 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 15:55:04.923399 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Feb 13 15:55:04.977853 ignition[672]: Ignition 2.20.0 Feb 13 15:55:04.977860 ignition[672]: Stage: fetch-offline Feb 13 15:55:04.977878 ignition[672]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:04.977883 ignition[672]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:04.977929 ignition[672]: parsed url from cmdline: "" Feb 13 15:55:04.977931 ignition[672]: no config URL provided Feb 13 15:55:04.977933 ignition[672]: reading system config file "/usr/lib/ignition/user.ign" Feb 13 15:55:04.977937 ignition[672]: no config at "/usr/lib/ignition/user.ign" Feb 13 15:55:04.978382 ignition[672]: config successfully fetched Feb 13 15:55:04.978399 ignition[672]: parsing config with SHA512: a3d3af7dcd252210fa42a7ecf07aa2bf9a9e58ba261ab2be7e3d760dcdebc86385fbca151dcde7663445de1dd3ae0c4c630ef60296099a1335de08f3fdf4b792 Feb 13 15:55:04.981059 unknown[672]: fetched base config from "system" Feb 13 15:55:04.981065 unknown[672]: fetched user config from "vmware" Feb 13 15:55:04.981325 ignition[672]: fetch-offline: fetch-offline passed Feb 13 15:55:04.982017 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 15:55:04.981370 ignition[672]: Ignition finished successfully Feb 13 15:55:04.989752 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 15:55:04.995383 systemd[1]: Starting systemd-networkd.service - Network Configuration... Feb 13 15:55:05.010715 systemd-networkd[806]: lo: Link UP Feb 13 15:55:05.010721 systemd-networkd[806]: lo: Gained carrier Feb 13 15:55:05.011490 systemd-networkd[806]: Enumeration completed Feb 13 15:55:05.011727 systemd-networkd[806]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Feb 13 15:55:05.011899 systemd[1]: Started systemd-networkd.service - Network Configuration. 
Feb 13 15:55:05.015488 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Feb 13 15:55:05.015593 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Feb 13 15:55:05.012044 systemd[1]: Reached target network.target - Network. Feb 13 15:55:05.012136 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Feb 13 15:55:05.014848 systemd-networkd[806]: ens192: Link UP Feb 13 15:55:05.014850 systemd-networkd[806]: ens192: Gained carrier Feb 13 15:55:05.023352 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Feb 13 15:55:05.032584 ignition[810]: Ignition 2.20.0 Feb 13 15:55:05.032591 ignition[810]: Stage: kargs Feb 13 15:55:05.032696 ignition[810]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:05.032702 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:05.033268 ignition[810]: kargs: kargs passed Feb 13 15:55:05.033306 ignition[810]: Ignition finished successfully Feb 13 15:55:05.034536 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Feb 13 15:55:05.038388 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Feb 13 15:55:05.044577 ignition[818]: Ignition 2.20.0 Feb 13 15:55:05.044586 ignition[818]: Stage: disks Feb 13 15:55:05.044689 ignition[818]: no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:05.044695 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:05.045223 ignition[818]: disks: disks passed Feb 13 15:55:05.045250 ignition[818]: Ignition finished successfully Feb 13 15:55:05.045860 systemd[1]: Finished ignition-disks.service - Ignition (disks). Feb 13 15:55:05.046361 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Feb 13 15:55:05.046606 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Feb 13 15:55:05.046828 systemd[1]: Reached target local-fs.target - Local File Systems. Feb 13 15:55:05.046921 systemd[1]: Reached target sysinit.target - System Initialization. Feb 13 15:55:05.047014 systemd[1]: Reached target basic.target - Basic System. Feb 13 15:55:05.051379 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Feb 13 15:55:05.079364 systemd-fsck[827]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Feb 13 15:55:05.080298 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Feb 13 15:55:05.842252 systemd[1]: Mounting sysroot.mount - /sysroot... Feb 13 15:55:05.907206 kernel: EXT4-fs (sda9): mounted filesystem 8023eced-1511-4e72-a58a-db1b8cb3210e r/w with ordered data mode. Quota mode: none. Feb 13 15:55:05.907238 systemd[1]: Mounted sysroot.mount - /sysroot. Feb 13 15:55:05.907578 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Feb 13 15:55:05.915359 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 15:55:05.916625 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Feb 13 15:55:05.916893 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Feb 13 15:55:05.916918 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Feb 13 15:55:05.916932 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 15:55:05.919549 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Feb 13 15:55:05.920672 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Feb 13 15:55:05.926753 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (835) Feb 13 15:55:05.926776 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:55:05.926785 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:55:05.926793 kernel: BTRFS info (device sda6): using free space tree Feb 13 15:55:05.934189 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 15:55:05.935427 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 15:55:05.951459 initrd-setup-root[859]: cut: /sysroot/etc/passwd: No such file or directory Feb 13 15:55:05.953905 initrd-setup-root[866]: cut: /sysroot/etc/group: No such file or directory Feb 13 15:55:05.956167 initrd-setup-root[873]: cut: /sysroot/etc/shadow: No such file or directory Feb 13 15:55:05.958629 initrd-setup-root[880]: cut: /sysroot/etc/gshadow: No such file or directory Feb 13 15:55:06.015019 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Feb 13 15:55:06.019392 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Feb 13 15:55:06.021769 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Feb 13 15:55:06.024189 kernel: BTRFS info (device sda6): last unmount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:55:06.046334 ignition[947]: INFO : Ignition 2.20.0 Feb 13 15:55:06.046334 ignition[947]: INFO : Stage: mount Feb 13 15:55:06.046689 ignition[947]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:06.046689 ignition[947]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:06.046968 ignition[947]: INFO : mount: mount passed Feb 13 15:55:06.047087 ignition[947]: INFO : Ignition finished successfully Feb 13 15:55:06.047683 systemd[1]: Finished ignition-mount.service - Ignition (mount). Feb 13 15:55:06.052230 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Feb 13 15:55:06.061001 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Feb 13 15:55:06.151551 systemd-networkd[806]: ens192: Gained IPv6LL Feb 13 15:55:06.834278 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 15:55:06.839376 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 15:55:06.869195 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (959) Feb 13 15:55:06.872853 kernel: BTRFS info (device sda6): first mount of filesystem 666795ea-1390-4b1f-8cde-ea877eeb5773 Feb 13 15:55:06.872869 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 15:55:06.872878 kernel: BTRFS info (device sda6): using free space tree Feb 13 15:55:06.876189 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 15:55:06.877191 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 15:55:06.894390 ignition[976]: INFO : Ignition 2.20.0 Feb 13 15:55:06.894390 ignition[976]: INFO : Stage: files Feb 13 15:55:06.894738 ignition[976]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:06.894738 ignition[976]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:06.894998 ignition[976]: DEBUG : files: compiled without relabeling support, skipping Feb 13 15:55:06.895681 ignition[976]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 15:55:06.895681 ignition[976]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 15:55:06.896881 ignition[976]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 15:55:06.897008 ignition[976]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 15:55:06.897127 ignition[976]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 15:55:06.897101 unknown[976]: wrote ssh authorized keys file for user: core Feb 13 15:55:06.898833 ignition[976]: INFO : 
files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 15:55:06.899049 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Feb 13 15:55:06.962724 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 15:55:07.451843 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 15:55:07.452103 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Feb 13 15:55:07.452103 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 15:55:07.452103 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 15:55:07.452103 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 15:55:07.452103 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 15:55:07.452879 ignition[976]: INFO : files: 
createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 15:55:07.452879 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Feb 13 15:55:07.918253 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Feb 13 15:55:08.132334 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 15:55:08.132589 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 15:55:08.132589 ignition[976]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 15:55:08.132589 ignition[976]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Feb 13 15:55:08.132589 ignition[976]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at 
"/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(c): [finished] processing unit "prepare-helm.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Feb 13 15:55:08.133138 ignition[976]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Feb 13 15:55:08.154901 ignition[976]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 15:55:08.157059 ignition[976]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 15:55:08.157259 ignition[976]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Feb 13 15:55:08.157259 ignition[976]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Feb 13 15:55:08.157259 ignition[976]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 15:55:08.158088 ignition[976]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 15:55:08.158088 ignition[976]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 15:55:08.158088 ignition[976]: INFO : files: files passed Feb 13 15:55:08.158088 ignition[976]: INFO : Ignition finished successfully Feb 13 15:55:08.158318 systemd[1]: Finished 
ignition-files.service - Ignition (files). Feb 13 15:55:08.163302 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Feb 13 15:55:08.165291 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 15:55:08.165586 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 15:55:08.165636 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 15:55:08.172599 initrd-setup-root-after-ignition[1007]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 15:55:08.172599 initrd-setup-root-after-ignition[1007]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 15:55:08.172951 initrd-setup-root-after-ignition[1011]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 15:55:08.173886 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 15:55:08.174267 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 15:55:08.178289 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 15:55:08.189351 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 15:55:08.189398 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 15:55:08.189706 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 15:55:08.189810 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 15:55:08.189930 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 15:55:08.191258 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 15:55:08.199821 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Feb 13 15:55:08.205294 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 15:55:08.210543 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 15:55:08.210812 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 15:55:08.210976 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 15:55:08.211119 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 15:55:08.211199 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 15:55:08.211416 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 15:55:08.211554 systemd[1]: Stopped target basic.target - Basic System. Feb 13 15:55:08.211685 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 15:55:08.211824 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 15:55:08.211970 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 15:55:08.212113 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 15:55:08.213547 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 15:55:08.213709 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 15:55:08.213854 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 15:55:08.214112 systemd[1]: Stopped target swap.target - Swaps. Feb 13 15:55:08.214592 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 15:55:08.214664 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Feb 13 15:55:08.214882 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Feb 13 15:55:08.215129 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 15:55:08.215532 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Feb 13 15:55:08.215577 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 15:55:08.215848 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 15:55:08.215915 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 15:55:08.216503 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 15:55:08.216581 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 15:55:08.216888 systemd[1]: Stopped target paths.target - Path Units. Feb 13 15:55:08.217332 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 15:55:08.221196 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 15:55:08.221360 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 15:55:08.221492 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 15:55:08.221618 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 15:55:08.221673 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 15:55:08.221813 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 15:55:08.221861 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 15:55:08.222014 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 15:55:08.222079 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 15:55:08.222253 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 15:55:08.222321 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 15:55:08.229390 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 15:55:08.231355 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 15:55:08.231605 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. 
Feb 13 15:55:08.231783 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 15:55:08.232188 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 15:55:08.232428 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 15:55:08.237582 ignition[1031]: INFO : Ignition 2.20.0 Feb 13 15:55:08.237582 ignition[1031]: INFO : Stage: umount Feb 13 15:55:08.242325 ignition[1031]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 15:55:08.242325 ignition[1031]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 15:55:08.242325 ignition[1031]: INFO : umount: umount passed Feb 13 15:55:08.242325 ignition[1031]: INFO : Ignition finished successfully Feb 13 15:55:08.240505 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 15:55:08.240694 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 15:55:08.240942 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 15:55:08.240986 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 15:55:08.241465 systemd[1]: Stopped target network.target - Network. Feb 13 15:55:08.241570 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 15:55:08.241609 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 15:55:08.241735 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 15:55:08.241758 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 15:55:08.241867 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 15:55:08.241891 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 15:55:08.241992 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Feb 13 15:55:08.242013 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 15:55:08.243593 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Feb 13 15:55:08.243886 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 15:55:08.248387 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 15:55:08.252708 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 15:55:08.252766 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 15:55:08.253721 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Feb 13 15:55:08.253835 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 15:55:08.253891 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 15:55:08.255367 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Feb 13 15:55:08.255778 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 15:55:08.255803 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 15:55:08.260340 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 15:55:08.260435 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 15:55:08.260464 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 15:55:08.260592 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Feb 13 15:55:08.260614 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 15:55:08.260731 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 15:55:08.260753 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 15:55:08.261838 systemd[1]: systemd-modules-load.service: Deactivated successfully. Feb 13 15:55:08.261864 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 15:55:08.261976 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Feb 13 15:55:08.261998 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 15:55:08.262156 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 15:55:08.263552 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Feb 13 15:55:08.263587 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Feb 13 15:55:08.268035 systemd[1]: network-cleanup.service: Deactivated successfully. Feb 13 15:55:08.268095 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Feb 13 15:55:08.275620 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 15:55:08.275706 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 15:55:08.275997 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 15:55:08.276025 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 15:55:08.276250 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 15:55:08.276268 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 15:55:08.276421 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 15:55:08.276447 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Feb 13 15:55:08.276699 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 15:55:08.276722 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 15:55:08.277078 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 15:55:08.277101 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 15:55:08.282437 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 15:55:08.282547 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. 
Feb 13 15:55:08.282574 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 15:55:08.282746 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Feb 13 15:55:08.282769 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 15:55:08.282888 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 15:55:08.282911 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 15:55:08.283027 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 15:55:08.283048 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 15:55:08.284165 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Feb 13 15:55:08.284213 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Feb 13 15:55:08.285299 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 15:55:08.285355 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 15:55:08.356271 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 15:55:08.356353 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 15:55:08.356812 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 15:55:08.356948 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 15:55:08.356985 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 15:55:08.361303 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 15:55:08.370768 systemd[1]: Switching root. Feb 13 15:55:08.406403 systemd-journald[217]: Journal stopped Feb 13 15:55:09.787028 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). 
Feb 13 15:55:09.787052 kernel: SELinux: policy capability network_peer_controls=1 Feb 13 15:55:09.787061 kernel: SELinux: policy capability open_perms=1 Feb 13 15:55:09.787066 kernel: SELinux: policy capability extended_socket_class=1 Feb 13 15:55:09.787072 kernel: SELinux: policy capability always_check_network=0 Feb 13 15:55:09.787077 kernel: SELinux: policy capability cgroup_seclabel=1 Feb 13 15:55:09.787084 kernel: SELinux: policy capability nnp_nosuid_transition=1 Feb 13 15:55:09.787090 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Feb 13 15:55:09.787096 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Feb 13 15:55:09.787101 kernel: audit: type=1403 audit(1739462109.070:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Feb 13 15:55:09.787107 systemd[1]: Successfully loaded SELinux policy in 30.449ms. Feb 13 15:55:09.787114 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.181ms. Feb 13 15:55:09.787121 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Feb 13 15:55:09.787129 systemd[1]: Detected virtualization vmware. Feb 13 15:55:09.787135 systemd[1]: Detected architecture x86-64. Feb 13 15:55:09.787142 systemd[1]: Detected first boot. Feb 13 15:55:09.787148 systemd[1]: Initializing machine ID from random generator. Feb 13 15:55:09.787156 zram_generator::config[1075]: No configuration found. 
Feb 13 15:55:09.787279 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc Feb 13 15:55:09.787290 kernel: Guest personality initialized and is active Feb 13 15:55:09.787296 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Feb 13 15:55:09.787302 kernel: Initialized host personality Feb 13 15:55:09.787307 kernel: NET: Registered PF_VSOCK protocol family Feb 13 15:55:09.787314 systemd[1]: Populated /etc with preset unit settings. Feb 13 15:55:09.787323 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 15:55:09.787331 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}" Feb 13 15:55:09.787337 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Feb 13 15:55:09.787343 systemd[1]: initrd-switch-root.service: Deactivated successfully. Feb 13 15:55:09.787349 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Feb 13 15:55:09.787356 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Feb 13 15:55:09.787364 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Feb 13 15:55:09.787371 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Feb 13 15:55:09.787377 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Feb 13 15:55:09.787384 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Feb 13 15:55:09.787390 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Feb 13 15:55:09.787397 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Feb 13 15:55:09.787403 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. 
Feb 13 15:55:09.787410 systemd[1]: Created slice user.slice - User and Session Slice. Feb 13 15:55:09.787419 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 15:55:09.787425 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 15:55:09.787434 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Feb 13 15:55:09.787440 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Feb 13 15:55:09.787447 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Feb 13 15:55:09.787454 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 15:55:09.787460 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... Feb 13 15:55:09.787467 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 15:55:09.787475 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Feb 13 15:55:09.787481 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Feb 13 15:55:09.787488 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Feb 13 15:55:09.787495 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Feb 13 15:55:09.787501 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 15:55:09.787508 systemd[1]: Reached target remote-fs.target - Remote File Systems. Feb 13 15:55:09.787514 systemd[1]: Reached target slices.target - Slice Units. Feb 13 15:55:09.787521 systemd[1]: Reached target swap.target - Swaps. Feb 13 15:55:09.787528 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Feb 13 15:55:09.787535 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. 
Feb 13 15:55:09.787542 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Feb 13 15:55:09.787549 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 15:55:09.787555 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 15:55:09.787564 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 15:55:09.787571 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Feb 13 15:55:09.787577 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Feb 13 15:55:09.787584 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Feb 13 15:55:09.787591 systemd[1]: Mounting media.mount - External Media Directory... Feb 13 15:55:09.787598 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:55:09.787604 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Feb 13 15:55:09.787611 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Feb 13 15:55:09.787619 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Feb 13 15:55:09.787627 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Feb 13 15:55:09.787634 systemd[1]: Reached target machines.target - Containers. Feb 13 15:55:09.787640 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Feb 13 15:55:09.787647 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)... Feb 13 15:55:09.787654 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 15:55:09.787660 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... 
Feb 13 15:55:09.787667 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Feb 13 15:55:09.787675 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Feb 13 15:55:09.787681 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Feb 13 15:55:09.787688 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Feb 13 15:55:09.787695 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Feb 13 15:55:09.787702 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Feb 13 15:55:09.787709 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Feb 13 15:55:09.787716 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Feb 13 15:55:09.787723 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Feb 13 15:55:09.787729 systemd[1]: Stopped systemd-fsck-usr.service. Feb 13 15:55:09.787738 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Feb 13 15:55:09.787745 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 15:55:09.787752 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 15:55:09.787759 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Feb 13 15:55:09.787765 kernel: fuse: init (API version 7.39) Feb 13 15:55:09.787772 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Feb 13 15:55:09.787779 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Feb 13 15:55:09.787785 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... 
Feb 13 15:55:09.787793 kernel: loop: module loaded Feb 13 15:55:09.787799 systemd[1]: verity-setup.service: Deactivated successfully. Feb 13 15:55:09.787806 systemd[1]: Stopped verity-setup.service. Feb 13 15:55:09.787813 kernel: ACPI: bus type drm_connector registered Feb 13 15:55:09.787819 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). Feb 13 15:55:09.787826 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Feb 13 15:55:09.787833 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Feb 13 15:55:09.787840 systemd[1]: Mounted media.mount - External Media Directory. Feb 13 15:55:09.787846 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Feb 13 15:55:09.787855 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Feb 13 15:55:09.787874 systemd-journald[1172]: Collecting audit messages is disabled. Feb 13 15:55:09.787890 systemd-journald[1172]: Journal started Feb 13 15:55:09.787906 systemd-journald[1172]: Runtime Journal (/run/log/journal/5b49c49056f2414d8f7812f80aeb4b33) is 4.8M, max 38.6M, 33.8M free. Feb 13 15:55:09.792212 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Feb 13 15:55:09.792276 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Feb 13 15:55:09.618044 systemd[1]: Queued start job for default target multi-user.target. Feb 13 15:55:09.629446 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Feb 13 15:55:09.629706 systemd[1]: systemd-journald.service: Deactivated successfully. Feb 13 15:55:09.796797 jq[1145]: true Feb 13 15:55:09.797201 systemd[1]: Started systemd-journald.service - Journal Service. Feb 13 15:55:09.797953 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 15:55:09.798200 systemd[1]: modprobe@configfs.service: Deactivated successfully. 
Feb 13 15:55:09.798300 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Feb 13 15:55:09.798514 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:55:09.798601 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:55:09.798809 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 15:55:09.798895 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 15:55:09.799282 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 15:55:09.802382 jq[1190]: true
Feb 13 15:55:09.800476 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 15:55:09.800700 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 13 15:55:09.800789 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Feb 13 15:55:09.800996 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 15:55:09.801081 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 15:55:09.801563 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 15:55:09.801795 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Feb 13 15:55:09.802027 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Feb 13 15:55:09.805748 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Feb 13 15:55:09.813354 systemd[1]: Reached target network-pre.target - Preparation for Network.
Feb 13 15:55:09.819370 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Feb 13 15:55:09.823300 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Feb 13 15:55:09.823409 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Feb 13 15:55:09.823426 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 15:55:09.824069 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Feb 13 15:55:09.839757 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Feb 13 15:55:09.854500 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Feb 13 15:55:09.854669 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:55:09.873130 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Feb 13 15:55:09.889284 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Feb 13 15:55:09.889443 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 15:55:09.892326 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Feb 13 15:55:09.894288 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 15:55:09.895265 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 15:55:09.901055 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Feb 13 15:55:09.906278 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 15:55:09.907417 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Feb 13 15:55:09.907675 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Feb 13 15:55:09.908693 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 15:55:09.913264 systemd-journald[1172]: Time spent on flushing to /var/log/journal/5b49c49056f2414d8f7812f80aeb4b33 is 29.788ms for 1851 entries.
Feb 13 15:55:09.913264 systemd-journald[1172]: System Journal (/var/log/journal/5b49c49056f2414d8f7812f80aeb4b33) is 8M, max 584.8M, 576.8M free.
Feb 13 15:55:09.964690 systemd-journald[1172]: Received client request to flush runtime journal.
Feb 13 15:55:09.965102 kernel: loop0: detected capacity change from 0 to 147912
Feb 13 15:55:09.909220 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Feb 13 15:55:09.909571 ignition[1192]: Ignition 2.20.0
Feb 13 15:55:09.909494 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Feb 13 15:55:09.909772 ignition[1192]: deleting config from guestinfo properties
Feb 13 15:55:09.918097 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Feb 13 15:55:09.914717 ignition[1192]: Successfully deleted config
Feb 13 15:55:09.919023 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Feb 13 15:55:09.931151 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Feb 13 15:55:09.933903 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Feb 13 15:55:09.950284 udevadm[1234]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Feb 13 15:55:09.968214 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Feb 13 15:55:09.971069 systemd-tmpfiles[1226]: ACLs are not supported, ignoring.
Feb 13 15:55:09.971080 systemd-tmpfiles[1226]: ACLs are not supported, ignoring.
Feb 13 15:55:09.973407 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 15:55:09.976399 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Feb 13 15:55:09.977120 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 15:55:09.983274 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Feb 13 15:55:10.026240 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Feb 13 15:55:10.042572 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Feb 13 15:55:10.047213 kernel: loop1: detected capacity change from 0 to 2960
Feb 13 15:55:10.049366 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 15:55:10.060607 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Feb 13 15:55:10.060792 systemd-tmpfiles[1250]: ACLs are not supported, ignoring.
Feb 13 15:55:10.064655 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 15:55:10.080238 kernel: loop2: detected capacity change from 0 to 210664
Feb 13 15:55:10.148198 kernel: loop3: detected capacity change from 0 to 138176
Feb 13 15:55:10.205255 kernel: loop4: detected capacity change from 0 to 147912
Feb 13 15:55:10.230412 kernel: loop5: detected capacity change from 0 to 2960
Feb 13 15:55:10.258212 kernel: loop6: detected capacity change from 0 to 210664
Feb 13 15:55:10.287348 kernel: loop7: detected capacity change from 0 to 138176
Feb 13 15:55:10.312391 (sd-merge)[1258]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Feb 13 15:55:10.313017 (sd-merge)[1258]: Merged extensions into '/usr'.
Feb 13 15:55:10.315908 systemd[1]: Reload requested from client PID 1225 ('systemd-sysext') (unit systemd-sysext.service)...
Feb 13 15:55:10.316060 systemd[1]: Reloading...
Feb 13 15:55:10.390194 ldconfig[1216]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Feb 13 15:55:10.395329 zram_generator::config[1282]: No configuration found.
Feb 13 15:55:10.456941 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 15:55:10.474699 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:55:10.518260 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Feb 13 15:55:10.518523 systemd[1]: Reloading finished in 201 ms.
Feb 13 15:55:10.536951 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Feb 13 15:55:10.537332 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Feb 13 15:55:10.537590 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Feb 13 15:55:10.542086 systemd[1]: Starting ensure-sysext.service...
Feb 13 15:55:10.544265 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 15:55:10.546413 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 15:55:10.553721 systemd[1]: Reload requested from client PID 1343 ('systemctl') (unit ensure-sysext.service)...
Feb 13 15:55:10.553730 systemd[1]: Reloading...
Feb 13 15:55:10.567076 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Feb 13 15:55:10.567317 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Feb 13 15:55:10.567782 systemd-tmpfiles[1344]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Feb 13 15:55:10.567938 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Feb 13 15:55:10.567976 systemd-tmpfiles[1344]: ACLs are not supported, ignoring.
Feb 13 15:55:10.569880 systemd-udevd[1345]: Using default interface naming scheme 'v255'.
Feb 13 15:55:10.574426 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 15:55:10.574434 systemd-tmpfiles[1344]: Skipping /boot
Feb 13 15:55:10.586607 systemd-tmpfiles[1344]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 15:55:10.586613 systemd-tmpfiles[1344]: Skipping /boot
Feb 13 15:55:10.588216 zram_generator::config[1371]: No configuration found.
Feb 13 15:55:10.702286 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 15:55:10.704325 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input3
Feb 13 15:55:10.709207 kernel: ACPI: button: Power Button [PWRF]
Feb 13 15:55:10.732108 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:55:10.745190 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1396)
Feb 13 15:55:10.792880 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Feb 13 15:55:10.793097 systemd[1]: Reloading finished in 239 ms.
Feb 13 15:55:10.799435 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 15:55:10.806905 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 15:55:10.817210 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Feb 13 15:55:10.841834 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Feb 13 15:55:10.847348 systemd[1]: Finished ensure-sysext.service.
Feb 13 15:55:10.849900 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:55:10.857680 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 15:55:10.859247 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input4
Feb 13 15:55:10.861256 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Feb 13 15:55:10.862319 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 15:55:10.862389 (udev-worker)[1386]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Feb 13 15:55:10.865027 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 15:55:10.871972 kernel: mousedev: PS/2 mouse device common for all mice
Feb 13 15:55:10.869267 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 15:55:10.872312 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 15:55:10.872497 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 15:55:10.874293 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Feb 13 15:55:10.874411 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Feb 13 15:55:10.877116 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Feb 13 15:55:10.880113 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 15:55:10.881526 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 15:55:10.883913 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Feb 13 15:55:10.885322 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Feb 13 15:55:10.885490 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 15:55:10.887723 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 15:55:10.887844 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 15:55:10.888067 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 15:55:10.888164 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 15:55:10.888454 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 15:55:10.888544 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 15:55:10.893704 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 15:55:10.894271 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 15:55:10.897093 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Feb 13 15:55:10.898135 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 15:55:10.898941 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 15:55:10.903393 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Feb 13 15:55:10.912397 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 15:55:10.912824 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Feb 13 15:55:10.920736 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Feb 13 15:55:10.921132 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Feb 13 15:55:10.928359 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Feb 13 15:55:10.934271 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Feb 13 15:55:10.945261 lvm[1502]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 15:55:10.949742 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Feb 13 15:55:10.950984 augenrules[1507]: No rules
Feb 13 15:55:10.953832 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 15:55:10.953984 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 15:55:10.955236 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Feb 13 15:55:10.972883 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Feb 13 15:55:10.973487 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 15:55:10.980362 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Feb 13 15:55:10.985582 lvm[1523]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 15:55:11.011358 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 15:55:11.015586 systemd-networkd[1473]: lo: Link UP
Feb 13 15:55:11.015590 systemd-networkd[1473]: lo: Gained carrier
Feb 13 15:55:11.016691 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Feb 13 15:55:11.017389 systemd-networkd[1473]: Enumeration completed
Feb 13 15:55:11.017596 systemd-networkd[1473]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Feb 13 15:55:11.017598 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 15:55:11.020329 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Feb 13 15:55:11.020452 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Feb 13 15:55:11.021366 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Feb 13 15:55:11.023277 systemd-networkd[1473]: ens192: Link UP
Feb 13 15:55:11.023365 systemd-networkd[1473]: ens192: Gained carrier
Feb 13 15:55:11.026705 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Feb 13 15:55:11.027324 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Feb 13 15:55:11.028540 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 15:55:11.032434 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Feb 13 15:55:11.032851 systemd[1]: Reached target time-set.target - System Time Set.
Feb 13 15:55:11.048387 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Feb 13 15:55:11.050364 systemd-resolved[1476]: Positive Trust Anchors:
Feb 13 15:55:11.050507 systemd-resolved[1476]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 15:55:11.050563 systemd-resolved[1476]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 15:55:11.053092 systemd-resolved[1476]: Defaulting to hostname 'linux'.
Feb 13 15:55:11.054125 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 15:55:11.054278 systemd[1]: Reached target network.target - Network.
Feb 13 15:55:11.054365 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 15:55:11.054476 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 15:55:11.054619 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Feb 13 15:55:11.054737 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Feb 13 15:55:11.054926 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Feb 13 15:55:11.055062 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Feb 13 15:55:11.055166 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Feb 13 15:55:11.055295 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Feb 13 15:55:11.055314 systemd[1]: Reached target paths.target - Path Units.
Feb 13 15:55:11.055391 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 15:55:11.056188 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Feb 13 15:55:11.057238 systemd[1]: Starting docker.socket - Docker Socket for the API...
Feb 13 15:55:11.058785 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Feb 13 15:55:11.058967 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Feb 13 15:55:11.059079 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Feb 13 15:55:11.061275 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Feb 13 15:55:11.061563 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Feb 13 15:55:11.062004 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Feb 13 15:55:11.062136 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 15:55:11.062229 systemd[1]: Reached target basic.target - Basic System.
Feb 13 15:55:11.062340 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Feb 13 15:55:11.062358 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Feb 13 15:55:11.062988 systemd[1]: Starting containerd.service - containerd container runtime...
Feb 13 15:55:11.075636 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Feb 13 15:55:11.076492 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Feb 13 15:55:11.079123 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Feb 13 15:55:11.079653 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Feb 13 15:55:11.081315 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Feb 13 15:55:11.082432 jq[1537]: false
Feb 13 15:55:11.089410 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Feb 13 15:55:11.091030 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Feb 13 15:55:11.092250 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Feb 13 15:55:11.094743 systemd[1]: Starting systemd-logind.service - User Login Management...
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found loop4
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found loop5
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found loop6
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found loop7
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda1
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda2
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda3
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found usr
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda4
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda6
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda7
Feb 13 15:55:11.096252 extend-filesystems[1538]: Found sda9
Feb 13 15:55:11.096252 extend-filesystems[1538]: Checking size of /dev/sda9
Feb 13 15:55:11.095704 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Feb 13 15:55:11.096106 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Feb 13 15:55:11.097556 systemd[1]: Starting update-engine.service - Update Engine...
Feb 13 15:55:11.100322 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Feb 13 15:55:11.101188 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Feb 13 15:55:11.102440 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Feb 13 15:55:11.102589 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Feb 13 15:55:11.119458 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Feb 13 15:55:11.119601 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Feb 13 15:55:11.123110 dbus-daemon[1536]: [system] SELinux support is enabled
Feb 13 15:55:11.125092 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Feb 13 15:55:11.129724 extend-filesystems[1538]: Old size kept for /dev/sda9
Feb 13 15:55:11.129724 extend-filesystems[1538]: Found sr0
Feb 13 15:55:11.128364 systemd[1]: extend-filesystems.service: Deactivated successfully.
Feb 13 15:55:11.128479 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Feb 13 15:55:11.130347 jq[1549]: true
Feb 13 15:55:11.128965 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Feb 13 15:55:11.128992 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Feb 13 15:55:11.129347 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Feb 13 15:55:11.129358 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Feb 13 15:55:11.134934 systemd[1]: motdgen.service: Deactivated successfully.
Feb 13 15:55:11.135058 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Feb 13 15:55:11.145409 (ntainerd)[1564]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Feb 13 15:55:11.146341 update_engine[1547]: I20250213 15:55:11.146245 1547 main.cc:92] Flatcar Update Engine starting
Feb 13 15:55:11.152382 tar[1552]: linux-amd64/helm
Feb 13 15:55:11.153412 update_engine[1547]: I20250213 15:55:11.153383 1547 update_check_scheduler.cc:74] Next update check in 5m37s
Feb 13 15:55:11.157340 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Feb 13 15:55:11.157654 systemd[1]: Started update-engine.service - Update Engine.
Feb 13 15:55:11.163943 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Feb 13 15:55:11.164825 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Feb 13 15:55:11.167822 jq[1569]: true
Feb 13 15:56:25.814965 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (1411)
Feb 13 15:56:25.814921 systemd-timesyncd[1478]: Contacted time server 108.61.73.244:123 (0.flatcar.pool.ntp.org).
Feb 13 15:56:25.814949 systemd-timesyncd[1478]: Initial clock synchronization to Thu 2025-02-13 15:56:25.814184 UTC.
Feb 13 15:56:25.815509 systemd-resolved[1476]: Clock change detected. Flushing caches.
Feb 13 15:56:25.819159 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Feb 13 15:56:25.831194 systemd-logind[1544]: Watching system buttons on /dev/input/event1 (Power Button)
Feb 13 15:56:25.831212 systemd-logind[1544]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Feb 13 15:56:25.834431 systemd-logind[1544]: New seat seat0.
Feb 13 15:56:25.840458 systemd[1]: Started systemd-logind.service - User Login Management.
Feb 13 15:56:25.870726 unknown[1577]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Feb 13 15:56:25.875299 unknown[1577]: Core dump limit set to -1
Feb 13 15:56:25.926593 locksmithd[1579]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Feb 13 15:56:25.936611 bash[1599]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 15:56:25.937971 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Feb 13 15:56:25.938510 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Feb 13 15:56:25.961059 sshd_keygen[1573]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Feb 13 15:56:25.983800 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Feb 13 15:56:25.991043 systemd[1]: Starting issuegen.service - Generate /run/issue...
Feb 13 15:56:25.996579 systemd[1]: issuegen.service: Deactivated successfully.
Feb 13 15:56:25.996707 systemd[1]: Finished issuegen.service - Generate /run/issue.
Feb 13 15:56:26.006292 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Feb 13 15:56:26.019264 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Feb 13 15:56:26.027771 systemd[1]: Started getty@tty1.service - Getty on tty1.
Feb 13 15:56:26.031228 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Feb 13 15:56:26.032077 systemd[1]: Reached target getty.target - Login Prompts.
Feb 13 15:56:26.074680 containerd[1564]: time="2025-02-13T15:56:26.074625510Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Feb 13 15:56:26.107179 containerd[1564]: time="2025-02-13T15:56:26.106755561Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.107607 containerd[1564]: time="2025-02-13T15:56:26.107591368Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:56:26.107815 containerd[1564]: time="2025-02-13T15:56:26.107807129Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Feb 13 15:56:26.107863 containerd[1564]: time="2025-02-13T15:56:26.107855538Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Feb 13 15:56:26.107977 containerd[1564]: time="2025-02-13T15:56:26.107968307Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Feb 13 15:56:26.108288 containerd[1564]: time="2025-02-13T15:56:26.108280263Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.108366 containerd[1564]: time="2025-02-13T15:56:26.108356043Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:56:26.108415 containerd[1564]: time="2025-02-13T15:56:26.108408112Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.108707 containerd[1564]: time="2025-02-13T15:56:26.108695932Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:56:26.108919 containerd[1564]: time="2025-02-13T15:56:26.108910549Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.108974 containerd[1564]: time="2025-02-13T15:56:26.108966191Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:56:26.109155 containerd[1564]: time="2025-02-13T15:56:26.109147541Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.109234 containerd[1564]: time="2025-02-13T15:56:26.109224923Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.109376 containerd[1564]: time="2025-02-13T15:56:26.109367218Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Feb 13 15:56:26.109799 containerd[1564]: time="2025-02-13T15:56:26.109788586Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 15:56:26.109988 containerd[1564]: time="2025-02-13T15:56:26.109979691Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Feb 13 15:56:26.110064 containerd[1564]: time="2025-02-13T15:56:26.110055212Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Feb 13 15:56:26.110302 containerd[1564]: time="2025-02-13T15:56:26.110292956Z" level=info msg="metadata content store policy set" policy=shared
Feb 13 15:56:26.116869 containerd[1564]: time="2025-02-13T15:56:26.116856780Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Feb 13 15:56:26.117079 containerd[1564]: time="2025-02-13T15:56:26.117069701Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Feb 13 15:56:26.117118 containerd[1564]: time="2025-02-13T15:56:26.117111151Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Feb 13 15:56:26.117175 containerd[1564]: time="2025-02-13T15:56:26.117167541Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Feb 13 15:56:26.117332 containerd[1564]: time="2025-02-13T15:56:26.117325032Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Feb 13 15:56:26.117460 containerd[1564]: time="2025-02-13T15:56:26.117435227Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Feb 13 15:56:26.117834 containerd[1564]: time="2025-02-13T15:56:26.117822706Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Feb 13 15:56:26.118425 containerd[1564]: time="2025-02-13T15:56:26.118415493Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Feb 13 15:56:26.118658 containerd[1564]: time="2025-02-13T15:56:26.118648664Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Feb 13 15:56:26.118703 containerd[1564]: time="2025-02-13T15:56:26.118695669Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Feb 13 15:56:26.118737 containerd[1564]: time="2025-02-13T15:56:26.118730788Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.118775 containerd[1564]: time="2025-02-13T15:56:26.118768203Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.118824 containerd[1564]: time="2025-02-13T15:56:26.118817672Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119249270Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119260947Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119268875Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119275423Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119281647Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119292613Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119302186Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"...
type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119308574Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119315425Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119321606Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119328532Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119335287Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119341662Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119560 containerd[1564]: time="2025-02-13T15:56:26.119348832Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119360852Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119367689Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119374395Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119381733Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." 
type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119389326Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119400424Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119407781Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119413274Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119441626Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119451252Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119456717Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119463081Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1 Feb 13 15:56:26.119752 containerd[1564]: time="2025-02-13T15:56:26.119468239Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.119914 containerd[1564]: time="2025-02-13T15:56:26.119475195Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." 
type=io.containerd.nri.v1 Feb 13 15:56:26.119914 containerd[1564]: time="2025-02-13T15:56:26.119480995Z" level=info msg="NRI interface is disabled by configuration." Feb 13 15:56:26.119914 containerd[1564]: time="2025-02-13T15:56:26.119486780Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.120612880Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true 
SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.120642939Z" level=info msg="Connect containerd service" Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.120660156Z" level=info msg="using legacy CRI server" Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.120664711Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.120717686Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Feb 13 15:56:26.121120 containerd[1564]: time="2025-02-13T15:56:26.121010877Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Feb 13 15:56:26.121586 containerd[1564]: time="2025-02-13T15:56:26.121271870Z" level=info msg="Start subscribing containerd event" Feb 13 
15:56:26.121586 containerd[1564]: time="2025-02-13T15:56:26.121457732Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Feb 13 15:56:26.121586 containerd[1564]: time="2025-02-13T15:56:26.121489251Z" level=info msg=serving... address=/run/containerd/containerd.sock Feb 13 15:56:26.121674 containerd[1564]: time="2025-02-13T15:56:26.121600049Z" level=info msg="Start recovering state" Feb 13 15:56:26.121674 containerd[1564]: time="2025-02-13T15:56:26.121671066Z" level=info msg="Start event monitor" Feb 13 15:56:26.121702 containerd[1564]: time="2025-02-13T15:56:26.121680894Z" level=info msg="Start snapshots syncer" Feb 13 15:56:26.121702 containerd[1564]: time="2025-02-13T15:56:26.121686309Z" level=info msg="Start cni network conf syncer for default" Feb 13 15:56:26.121702 containerd[1564]: time="2025-02-13T15:56:26.121693473Z" level=info msg="Start streaming server" Feb 13 15:56:26.121769 systemd[1]: Started containerd.service - containerd container runtime. Feb 13 15:56:26.122268 containerd[1564]: time="2025-02-13T15:56:26.122130214Z" level=info msg="containerd successfully booted in 0.048159s" Feb 13 15:56:26.228382 tar[1552]: linux-amd64/LICENSE Feb 13 15:56:26.228382 tar[1552]: linux-amd64/README.md Feb 13 15:56:26.240050 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Feb 13 15:56:27.699742 systemd-networkd[1473]: ens192: Gained IPv6LL Feb 13 15:56:27.700992 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Feb 13 15:56:27.701989 systemd[1]: Reached target network-online.target - Network is Online. Feb 13 15:56:27.707898 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Feb 13 15:56:27.709498 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:56:27.711716 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Feb 13 15:56:27.732093 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Feb 13 15:56:27.738531 systemd[1]: coreos-metadata.service: Deactivated successfully. Feb 13 15:56:27.738668 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Feb 13 15:56:27.739117 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Feb 13 15:56:28.453267 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:56:28.453949 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 15:56:28.454168 systemd[1]: Startup finished in 986ms (kernel) + 6.443s (initrd) + 4.776s (userspace) = 12.207s. Feb 13 15:56:28.459711 (kubelet)[1712]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 15:56:28.490665 login[1641]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 15:56:28.491811 login[1644]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 15:56:28.499481 systemd-logind[1544]: New session 2 of user core. Feb 13 15:56:28.500333 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 15:56:28.509696 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 15:56:28.513995 systemd-logind[1544]: New session 1 of user core. Feb 13 15:56:28.519169 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 15:56:28.523799 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 15:56:28.525241 (systemd)[1719]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 15:56:28.526953 systemd-logind[1544]: New session c1 of user core. Feb 13 15:56:28.609130 systemd[1719]: Queued start job for default target default.target. Feb 13 15:56:28.613321 systemd[1719]: Created slice app.slice - User Application Slice. 
Feb 13 15:56:28.613338 systemd[1719]: Reached target paths.target - Paths. Feb 13 15:56:28.613360 systemd[1719]: Reached target timers.target - Timers. Feb 13 15:56:28.614082 systemd[1719]: Starting dbus.socket - D-Bus User Message Bus Socket... Feb 13 15:56:28.621450 systemd[1719]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 15:56:28.621529 systemd[1719]: Reached target sockets.target - Sockets. Feb 13 15:56:28.621663 systemd[1719]: Reached target basic.target - Basic System. Feb 13 15:56:28.622302 systemd[1719]: Reached target default.target - Main User Target. Feb 13 15:56:28.622317 systemd[1719]: Startup finished in 91ms. Feb 13 15:56:28.622319 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 15:56:28.624063 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 15:56:28.624630 systemd[1]: Started session-2.scope - Session 2 of User core. Feb 13 15:56:28.975132 kubelet[1712]: E0213 15:56:28.975076 1712 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 15:56:28.976479 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 15:56:28.976578 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 15:56:28.976884 systemd[1]: kubelet.service: Consumed 601ms CPU time, 246.3M memory peak. Feb 13 15:56:31.421001 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 15:56:31.422306 systemd[1]: Started sshd@0-139.178.70.106:22-51.178.43.161:60146.service - OpenSSH per-connection server daemon (51.178.43.161:60146). 
Feb 13 15:56:32.262053 sshd[1755]: Invalid user pupkin from 51.178.43.161 port 60146 Feb 13 15:56:32.415625 sshd[1755]: Received disconnect from 51.178.43.161 port 60146:11: Bye Bye [preauth] Feb 13 15:56:32.415625 sshd[1755]: Disconnected from invalid user pupkin 51.178.43.161 port 60146 [preauth] Feb 13 15:56:32.416611 systemd[1]: sshd@0-139.178.70.106:22-51.178.43.161:60146.service: Deactivated successfully. Feb 13 15:56:39.227126 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 15:56:39.233710 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:56:39.576919 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:56:39.579705 (kubelet)[1767]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 15:56:39.615165 kubelet[1767]: E0213 15:56:39.615132 1767 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 15:56:39.617617 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 15:56:39.617791 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 15:56:39.618089 systemd[1]: kubelet.service: Consumed 87ms CPU time, 97.6M memory peak. Feb 13 15:56:44.361750 systemd[1]: Started sshd@1-139.178.70.106:22-185.213.165.55:53266.service - OpenSSH per-connection server daemon (185.213.165.55:53266). 
Feb 13 15:56:45.624766 sshd[1776]: Invalid user beatrice from 185.213.165.55 port 53266 Feb 13 15:56:45.861615 sshd[1776]: Received disconnect from 185.213.165.55 port 53266:11: Bye Bye [preauth] Feb 13 15:56:45.861615 sshd[1776]: Disconnected from invalid user beatrice 185.213.165.55 port 53266 [preauth] Feb 13 15:56:45.862591 systemd[1]: sshd@1-139.178.70.106:22-185.213.165.55:53266.service: Deactivated successfully. Feb 13 15:56:49.868076 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Feb 13 15:56:49.877688 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:56:50.057229 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:56:50.059438 (kubelet)[1788]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 15:56:50.090815 kubelet[1788]: E0213 15:56:50.090790 1788 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 15:56:50.092449 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 15:56:50.092596 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 15:56:50.092933 systemd[1]: kubelet.service: Consumed 71ms CPU time, 96M memory peak. Feb 13 15:56:57.042600 systemd[1]: Started sshd@2-139.178.70.106:22-187.62.205.20:54268.service - OpenSSH per-connection server daemon (187.62.205.20:54268). 
Feb 13 15:56:58.134587 sshd[1797]: Invalid user vds from 187.62.205.20 port 54268 Feb 13 15:56:58.337230 sshd[1797]: Received disconnect from 187.62.205.20 port 54268:11: Bye Bye [preauth] Feb 13 15:56:58.337230 sshd[1797]: Disconnected from invalid user vds 187.62.205.20 port 54268 [preauth] Feb 13 15:56:58.337932 systemd[1]: sshd@2-139.178.70.106:22-187.62.205.20:54268.service: Deactivated successfully. Feb 13 15:57:00.235817 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Feb 13 15:57:00.246874 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:57:00.581787 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 15:57:00.593839 (kubelet)[1809]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 15:57:00.652195 kubelet[1809]: E0213 15:57:00.651933 1809 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 15:57:00.654161 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 15:57:00.654363 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 15:57:00.654754 systemd[1]: kubelet.service: Consumed 95ms CPU time, 97.8M memory peak. Feb 13 15:57:05.943868 systemd[1]: Started sshd@3-139.178.70.106:22-147.75.109.163:47770.service - OpenSSH per-connection server daemon (147.75.109.163:47770). 
Feb 13 15:57:05.979396 sshd[1818]: Accepted publickey for core from 147.75.109.163 port 47770 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:05.980233 sshd-session[1818]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:05.983582 systemd-logind[1544]: New session 3 of user core. Feb 13 15:57:05.993768 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 15:57:06.057697 systemd[1]: Started sshd@4-139.178.70.106:22-147.75.109.163:47782.service - OpenSSH per-connection server daemon (147.75.109.163:47782). Feb 13 15:57:06.089094 sshd[1823]: Accepted publickey for core from 147.75.109.163 port 47782 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.089754 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.093115 systemd-logind[1544]: New session 4 of user core. Feb 13 15:57:06.098750 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 15:57:06.147253 sshd[1825]: Connection closed by 147.75.109.163 port 47782 Feb 13 15:57:06.147209 sshd-session[1823]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:06.156113 systemd[1]: sshd@4-139.178.70.106:22-147.75.109.163:47782.service: Deactivated successfully. Feb 13 15:57:06.157105 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 15:57:06.157630 systemd-logind[1544]: Session 4 logged out. Waiting for processes to exit. Feb 13 15:57:06.160810 systemd[1]: Started sshd@5-139.178.70.106:22-147.75.109.163:47794.service - OpenSSH per-connection server daemon (147.75.109.163:47794). Feb 13 15:57:06.162746 systemd-logind[1544]: Removed session 4. 
Feb 13 15:57:06.197887 sshd[1830]: Accepted publickey for core from 147.75.109.163 port 47794 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.198656 sshd-session[1830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.202452 systemd-logind[1544]: New session 5 of user core. Feb 13 15:57:06.208664 systemd[1]: Started session-5.scope - Session 5 of User core. Feb 13 15:57:06.255539 sshd[1833]: Connection closed by 147.75.109.163 port 47794 Feb 13 15:57:06.255867 sshd-session[1830]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:06.267050 systemd[1]: sshd@5-139.178.70.106:22-147.75.109.163:47794.service: Deactivated successfully. Feb 13 15:57:06.268318 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 15:57:06.269074 systemd-logind[1544]: Session 5 logged out. Waiting for processes to exit. Feb 13 15:57:06.275753 systemd[1]: Started sshd@6-139.178.70.106:22-147.75.109.163:47802.service - OpenSSH per-connection server daemon (147.75.109.163:47802). Feb 13 15:57:06.276176 systemd-logind[1544]: Removed session 5. Feb 13 15:57:06.312800 sshd[1838]: Accepted publickey for core from 147.75.109.163 port 47802 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.313662 sshd-session[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.317589 systemd-logind[1544]: New session 6 of user core. Feb 13 15:57:06.322686 systemd[1]: Started session-6.scope - Session 6 of User core. Feb 13 15:57:06.371564 sshd[1841]: Connection closed by 147.75.109.163 port 47802 Feb 13 15:57:06.371359 sshd-session[1838]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:06.380296 systemd[1]: sshd@6-139.178.70.106:22-147.75.109.163:47802.service: Deactivated successfully. Feb 13 15:57:06.381349 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 15:57:06.382437 systemd-logind[1544]: Session 6 logged out. 
Waiting for processes to exit. Feb 13 15:57:06.383363 systemd[1]: Started sshd@7-139.178.70.106:22-147.75.109.163:47814.service - OpenSSH per-connection server daemon (147.75.109.163:47814). Feb 13 15:57:06.385047 systemd-logind[1544]: Removed session 6. Feb 13 15:57:06.420922 sshd[1846]: Accepted publickey for core from 147.75.109.163 port 47814 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.421564 sshd-session[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.424989 systemd-logind[1544]: New session 7 of user core. Feb 13 15:57:06.430770 systemd[1]: Started session-7.scope - Session 7 of User core. Feb 13 15:57:06.527609 sudo[1850]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 15:57:06.527827 sudo[1850]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:06.541191 sudo[1850]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:06.542068 sshd[1849]: Connection closed by 147.75.109.163 port 47814 Feb 13 15:57:06.543126 sshd-session[1846]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:06.551941 systemd[1]: sshd@7-139.178.70.106:22-147.75.109.163:47814.service: Deactivated successfully. Feb 13 15:57:06.553012 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 15:57:06.553477 systemd-logind[1544]: Session 7 logged out. Waiting for processes to exit. Feb 13 15:57:06.556829 systemd[1]: Started sshd@8-139.178.70.106:22-147.75.109.163:47830.service - OpenSSH per-connection server daemon (147.75.109.163:47830). Feb 13 15:57:06.557900 systemd-logind[1544]: Removed session 7. 
Feb 13 15:57:06.592486 sshd[1855]: Accepted publickey for core from 147.75.109.163 port 47830 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.593374 sshd-session[1855]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.596560 systemd-logind[1544]: New session 8 of user core. Feb 13 15:57:06.605746 systemd[1]: Started session-8.scope - Session 8 of User core. Feb 13 15:57:06.654314 sudo[1860]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 15:57:06.654731 sudo[1860]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:06.657138 sudo[1860]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:06.660823 sudo[1859]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 15:57:06.661016 sudo[1859]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:06.672053 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 15:57:06.692920 augenrules[1882]: No rules Feb 13 15:57:06.693263 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 15:57:06.693443 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 15:57:06.694024 sudo[1859]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:06.695090 sshd[1858]: Connection closed by 147.75.109.163 port 47830 Feb 13 15:57:06.695448 sshd-session[1855]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:06.705291 systemd[1]: sshd@8-139.178.70.106:22-147.75.109.163:47830.service: Deactivated successfully. Feb 13 15:57:06.706353 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 15:57:06.707248 systemd-logind[1544]: Session 8 logged out. Waiting for processes to exit. 
Feb 13 15:57:06.708155 systemd[1]: Started sshd@9-139.178.70.106:22-147.75.109.163:47840.service - OpenSSH per-connection server daemon (147.75.109.163:47840). Feb 13 15:57:06.709781 systemd-logind[1544]: Removed session 8. Feb 13 15:57:06.745502 sshd[1890]: Accepted publickey for core from 147.75.109.163 port 47840 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:57:06.746360 sshd-session[1890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:57:06.750872 systemd-logind[1544]: New session 9 of user core. Feb 13 15:57:06.756671 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 15:57:06.806421 sudo[1894]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 15:57:06.807267 sudo[1894]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 15:57:07.291719 systemd[1]: Starting docker.service - Docker Application Container Engine... Feb 13 15:57:07.291789 (dockerd)[1910]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Feb 13 15:57:07.865405 dockerd[1910]: time="2025-02-13T15:57:07.865364615Z" level=info msg="Starting up" Feb 13 15:57:08.035223 dockerd[1910]: time="2025-02-13T15:57:08.035190844Z" level=info msg="Loading containers: start." Feb 13 15:57:08.220600 kernel: Initializing XFRM netlink socket Feb 13 15:57:08.331801 systemd-networkd[1473]: docker0: Link UP Feb 13 15:57:08.358402 dockerd[1910]: time="2025-02-13T15:57:08.358377667Z" level=info msg="Loading containers: done." Feb 13 15:57:08.367754 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3328309194-merged.mount: Deactivated successfully. 
Feb 13 15:57:08.368755 dockerd[1910]: time="2025-02-13T15:57:08.368725439Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 15:57:08.368829 dockerd[1910]: time="2025-02-13T15:57:08.368812525Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Feb 13 15:57:08.369010 dockerd[1910]: time="2025-02-13T15:57:08.368994631Z" level=info msg="Daemon has completed initialization" Feb 13 15:57:08.395386 dockerd[1910]: time="2025-02-13T15:57:08.395342726Z" level=info msg="API listen on /run/docker.sock" Feb 13 15:57:08.396069 systemd[1]: Started docker.service - Docker Application Container Engine. Feb 13 15:57:09.676352 containerd[1564]: time="2025-02-13T15:57:09.676282157Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.10\"" Feb 13 15:57:10.430155 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount219396074.mount: Deactivated successfully. Feb 13 15:57:10.738338 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Feb 13 15:57:10.745715 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 15:57:10.963171 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 15:57:10.965343 (kubelet)[2165]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 15:57:11.006968 kubelet[2165]: E0213 15:57:11.006872 2165 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 15:57:11.007986 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 15:57:11.008066 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 15:57:11.008449 systemd[1]: kubelet.service: Consumed 76ms CPU time, 97.9M memory peak.
Feb 13 15:57:11.249689 update_engine[1547]: I20250213 15:57:11.249642 1547 update_attempter.cc:509] Updating boot flags...
Feb 13 15:57:11.280567 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2183)
Feb 13 15:57:11.332241 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2184)
Feb 13 15:57:11.370571 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 40 scanned by (udev-worker) (2184)
Feb 13 15:57:11.745488 containerd[1564]: time="2025-02-13T15:57:11.745454973Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:11.746261 containerd[1564]: time="2025-02-13T15:57:11.746237346Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.10: active requests=0, bytes read=32678214"
Feb 13 15:57:11.746667 containerd[1564]: time="2025-02-13T15:57:11.746651593Z" level=info msg="ImageCreate event name:\"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:11.748107 containerd[1564]: time="2025-02-13T15:57:11.748092018Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:63b2b4b4e9b5dcb5b1b6cec9d5f5f538791a40cd8cb273ef530e6d6535aa0b43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:11.749104 containerd[1564]: time="2025-02-13T15:57:11.749088737Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.10\" with image id \"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:63b2b4b4e9b5dcb5b1b6cec9d5f5f538791a40cd8cb273ef530e6d6535aa0b43\", size \"32675014\" in 2.072780461s"
Feb 13 15:57:11.749128 containerd[1564]: time="2025-02-13T15:57:11.749109272Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.10\" returns image reference \"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\""
Feb 13 15:57:11.761719 containerd[1564]: time="2025-02-13T15:57:11.761692984Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.10\""
Feb 13 15:57:12.648119 systemd[1]: Started sshd@10-139.178.70.106:22-150.138.115.76:52232.service - OpenSSH per-connection server daemon (150.138.115.76:52232).
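As an aside on the pull entries above: containerd logs both the blob size and the wall-clock duration, so effective pull throughput can be derived directly. A minimal sketch, using the figures from the kube-apiserver pull above (32,675,014 bytes in 2.072780461 s):

```python
# Effective pull throughput from the "Pulled image" entry above.
# Values copied from the kube-apiserver v1.30.10 pull log line.
size_bytes = 32675014
duration_s = 2.072780461

throughput_mib_s = size_bytes / duration_s / (1024 * 1024)
print(f"{throughput_mib_s:.1f} MiB/s")  # roughly 15 MiB/s for this pull
```

The same arithmetic applies to the controller-manager, scheduler, proxy, coredns, pause, and etcd pulls that follow.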
Feb 13 15:57:13.013989 containerd[1564]: time="2025-02-13T15:57:13.013958460Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:13.015175 containerd[1564]: time="2025-02-13T15:57:13.014952852Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.10: active requests=0, bytes read=29611545"
Feb 13 15:57:13.015175 containerd[1564]: time="2025-02-13T15:57:13.014992101Z" level=info msg="ImageCreate event name:\"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:13.016958 containerd[1564]: time="2025-02-13T15:57:13.016946569Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:99b3336343ea48be24f1e64774825e9f8d5170bd2ed482ff336548eb824f5f58\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:13.018061 containerd[1564]: time="2025-02-13T15:57:13.017985182Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.10\" with image id \"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:99b3336343ea48be24f1e64774825e9f8d5170bd2ed482ff336548eb824f5f58\", size \"31058091\" in 1.256272747s"
Feb 13 15:57:13.018061 containerd[1564]: time="2025-02-13T15:57:13.018000818Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.10\" returns image reference \"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\""
Feb 13 15:57:13.030288 containerd[1564]: time="2025-02-13T15:57:13.030138239Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.10\""
Feb 13 15:57:14.246210 containerd[1564]: time="2025-02-13T15:57:14.245604682Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:14.246210 containerd[1564]: time="2025-02-13T15:57:14.246011568Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.10: active requests=0, bytes read=17782130"
Feb 13 15:57:14.246210 containerd[1564]: time="2025-02-13T15:57:14.246186453Z" level=info msg="ImageCreate event name:\"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:14.247828 containerd[1564]: time="2025-02-13T15:57:14.247814246Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:cf7eb256192f1f51093fe278c209a9368f0675eb61ed01b148af47d2f21c002d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:14.248486 containerd[1564]: time="2025-02-13T15:57:14.248470482Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.10\" with image id \"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:cf7eb256192f1f51093fe278c209a9368f0675eb61ed01b148af47d2f21c002d\", size \"19228694\" in 1.218312934s"
Feb 13 15:57:14.248514 containerd[1564]: time="2025-02-13T15:57:14.248487129Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.10\" returns image reference \"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\""
Feb 13 15:57:14.261031 containerd[1564]: time="2025-02-13T15:57:14.261013837Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\""
Feb 13 15:57:14.945175 sshd[2204]: Invalid user fj from 150.138.115.76 port 52232
Feb 13 15:57:15.120777 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount382812420.mount: Deactivated successfully.
Feb 13 15:57:15.131426 sshd[2204]: Received disconnect from 150.138.115.76 port 52232:11: Bye Bye [preauth]
Feb 13 15:57:15.131426 sshd[2204]: Disconnected from invalid user fj 150.138.115.76 port 52232 [preauth]
Feb 13 15:57:15.132140 systemd[1]: sshd@10-139.178.70.106:22-150.138.115.76:52232.service: Deactivated successfully.
Feb 13 15:57:15.423575 containerd[1564]: time="2025-02-13T15:57:15.423245873Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:15.428301 containerd[1564]: time="2025-02-13T15:57:15.428269781Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.10: active requests=0, bytes read=29057858"
Feb 13 15:57:15.434440 containerd[1564]: time="2025-02-13T15:57:15.434398650Z" level=info msg="ImageCreate event name:\"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:15.442088 containerd[1564]: time="2025-02-13T15:57:15.442046856Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:15.442666 containerd[1564]: time="2025-02-13T15:57:15.442313233Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.10\" with image id \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\", repo tag \"registry.k8s.io/kube-proxy:v1.30.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\", size \"29056877\" in 1.181185406s"
Feb 13 15:57:15.442666 containerd[1564]: time="2025-02-13T15:57:15.442334106Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\" returns image reference \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\""
Feb 13 15:57:15.457286 containerd[1564]: time="2025-02-13T15:57:15.457259042Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
Feb 13 15:57:16.084614 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount817195287.mount: Deactivated successfully.
Feb 13 15:57:16.813492 containerd[1564]: time="2025-02-13T15:57:16.813443756Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:16.823599 containerd[1564]: time="2025-02-13T15:57:16.823565195Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
Feb 13 15:57:16.832070 containerd[1564]: time="2025-02-13T15:57:16.832036721Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:16.837949 containerd[1564]: time="2025-02-13T15:57:16.837921673Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:16.839381 containerd[1564]: time="2025-02-13T15:57:16.839269160Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.381982995s"
Feb 13 15:57:16.839381 containerd[1564]: time="2025-02-13T15:57:16.839292092Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
Feb 13 15:57:16.856937 containerd[1564]: time="2025-02-13T15:57:16.856826978Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
Feb 13 15:57:17.601417 systemd[1]: Started sshd@11-139.178.70.106:22-188.94.154.98:38310.service - OpenSSH per-connection server daemon (188.94.154.98:38310).
Feb 13 15:57:17.658896 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount231096660.mount: Deactivated successfully.
Feb 13 15:57:17.661144 containerd[1564]: time="2025-02-13T15:57:17.661099815Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:17.661510 containerd[1564]: time="2025-02-13T15:57:17.661490806Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290"
Feb 13 15:57:17.661610 containerd[1564]: time="2025-02-13T15:57:17.661597087Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:17.662863 containerd[1564]: time="2025-02-13T15:57:17.662849919Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:17.663346 containerd[1564]: time="2025-02-13T15:57:17.663333012Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 806.416204ms"
Feb 13 15:57:17.663397 containerd[1564]: time="2025-02-13T15:57:17.663389516Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
Feb 13 15:57:17.675815 containerd[1564]: time="2025-02-13T15:57:17.675789030Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
Feb 13 15:57:18.217964 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3303688128.mount: Deactivated successfully.
Feb 13 15:57:18.809202 sshd[2283]: Invalid user icosftp from 188.94.154.98 port 38310
Feb 13 15:57:19.042478 sshd[2283]: Received disconnect from 188.94.154.98 port 38310:11: Bye Bye [preauth]
Feb 13 15:57:19.042478 sshd[2283]: Disconnected from invalid user icosftp 188.94.154.98 port 38310 [preauth]
Feb 13 15:57:19.043589 systemd[1]: sshd@11-139.178.70.106:22-188.94.154.98:38310.service: Deactivated successfully.
Feb 13 15:57:21.235687 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Feb 13 15:57:21.241653 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:21.502666 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:21.506131 (kubelet)[2343]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Feb 13 15:57:21.654908 kubelet[2343]: E0213 15:57:21.654860 2343 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Feb 13 15:57:21.656407 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Feb 13 15:57:21.656498 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Feb 13 15:57:21.656714 systemd[1]: kubelet.service: Consumed 112ms CPU time, 97.7M memory peak.
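The repeated kubelet crash loop above (restart counters 4 and 5) is the expected pre-bootstrap state: /var/lib/kubelet/config.yaml does not exist yet, and on a kubeadm-managed node that file is written during `kubeadm init`/`kubeadm join`. As a hedged illustration only (not the actual file from this host), such a KubeletConfiguration might begin:

```yaml
# /var/lib/kubelet/config.yaml -- illustrative sketch, normally generated by kubeadm;
# field values here are assumptions, except where noted from this log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd                      # matches CgroupDriver:"systemd" in the nodeConfig logged below
staticPodPath: /etc/kubernetes/manifests   # matches the "Adding static pod path" entry logged below
```

Once the file exists, the systemd restart job succeeds, as the later kubelet[2537] startup shows.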
Feb 13 15:57:22.694276 containerd[1564]: time="2025-02-13T15:57:22.693623965Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:22.694276 containerd[1564]: time="2025-02-13T15:57:22.694016908Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571"
Feb 13 15:57:22.694276 containerd[1564]: time="2025-02-13T15:57:22.694249435Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:22.697555 containerd[1564]: time="2025-02-13T15:57:22.697520851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:57:22.698156 containerd[1564]: time="2025-02-13T15:57:22.698141355Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.022330831s"
Feb 13 15:57:22.698207 containerd[1564]: time="2025-02-13T15:57:22.698198997Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
Feb 13 15:57:24.629310 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:24.629409 systemd[1]: kubelet.service: Consumed 112ms CPU time, 97.7M memory peak.
Feb 13 15:57:24.634721 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:24.651590 systemd[1]: Reload requested from client PID 2419 ('systemctl') (unit session-9.scope)...
Feb 13 15:57:24.651600 systemd[1]: Reloading...
Feb 13 15:57:24.725570 zram_generator::config[2463]: No configuration found.
Feb 13 15:57:24.785038 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 15:57:24.802927 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:57:24.867289 systemd[1]: Reloading finished in 215 ms.
Feb 13 15:57:24.903594 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:24.906476 (kubelet)[2525]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 15:57:24.907852 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:24.908103 systemd[1]: kubelet.service: Deactivated successfully.
Feb 13 15:57:24.908265 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:24.908336 systemd[1]: kubelet.service: Consumed 49ms CPU time, 84.4M memory peak.
Feb 13 15:57:24.913827 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:25.120840 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:25.125073 (kubelet)[2537]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 15:57:25.156611 kubelet[2537]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 15:57:25.156611 kubelet[2537]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 13 15:57:25.156611 kubelet[2537]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 15:57:25.157558 kubelet[2537]: I0213 15:57:25.156843 2537 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 15:57:25.379781 kubelet[2537]: I0213 15:57:25.379758 2537 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Feb 13 15:57:25.379781 kubelet[2537]: I0213 15:57:25.379776 2537 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 15:57:25.379915 kubelet[2537]: I0213 15:57:25.379904 2537 server.go:927] "Client rotation is on, will bootstrap in background"
Feb 13 15:57:25.421387 kubelet[2537]: I0213 15:57:25.421315 2537 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 15:57:25.424293 kubelet[2537]: E0213 15:57:25.423981 2537 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.438802 kubelet[2537]: I0213 15:57:25.438790 2537 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 15:57:25.439034 kubelet[2537]: I0213 15:57:25.439017 2537 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 15:57:25.440209 kubelet[2537]: I0213 15:57:25.439069 2537 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Feb 13 15:57:25.440708 kubelet[2537]: I0213 15:57:25.440700 2537 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 15:57:25.440875 kubelet[2537]: I0213 15:57:25.440744 2537 container_manager_linux.go:301] "Creating device plugin manager"
Feb 13 15:57:25.440875 kubelet[2537]: I0213 15:57:25.440809 2537 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 15:57:25.441579 kubelet[2537]: I0213 15:57:25.441500 2537 kubelet.go:400] "Attempting to sync node with API server"
Feb 13 15:57:25.441579 kubelet[2537]: I0213 15:57:25.441511 2537 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 15:57:25.441854 kubelet[2537]: W0213 15:57:25.441812 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.441854 kubelet[2537]: E0213 15:57:25.441843 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.442003 kubelet[2537]: I0213 15:57:25.441991 2537 kubelet.go:312] "Adding apiserver pod source"
Feb 13 15:57:25.442025 kubelet[2537]: I0213 15:57:25.442005 2537 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 15:57:25.446271 kubelet[2537]: W0213 15:57:25.446242 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.446306 kubelet[2537]: E0213 15:57:25.446273 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.449183 kubelet[2537]: I0213 15:57:25.449170 2537 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 15:57:25.457094 kubelet[2537]: I0213 15:57:25.457078 2537 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 15:57:25.466153 kubelet[2537]: W0213 15:57:25.466134 2537 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 13 15:57:25.474130 kubelet[2537]: I0213 15:57:25.474114 2537 server.go:1264] "Started kubelet"
Feb 13 15:57:25.479698 kubelet[2537]: I0213 15:57:25.479411 2537 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 15:57:25.486748 kubelet[2537]: I0213 15:57:25.486732 2537 server.go:455] "Adding debug handlers to kubelet server"
Feb 13 15:57:25.487144 kubelet[2537]: I0213 15:57:25.487101 2537 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 15:57:25.487366 kubelet[2537]: I0213 15:57:25.487350 2537 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 15:57:25.488414 kubelet[2537]: E0213 15:57:25.487682 2537 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.106:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.106:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1823cfb239afddd3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 15:57:25.474098643 +0000 UTC m=+0.346582228,LastTimestamp:2025-02-13 15:57:25.474098643 +0000 UTC m=+0.346582228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Feb 13 15:57:25.488414 kubelet[2537]: I0213 15:57:25.488333 2537 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 15:57:25.494051 kubelet[2537]: E0213 15:57:25.494035 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:25.494321 kubelet[2537]: I0213 15:57:25.494310 2537 volume_manager.go:291] "Starting Kubelet Volume Manager"
Feb 13 15:57:25.494447 kubelet[2537]: I0213 15:57:25.494438 2537 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Feb 13 15:57:25.495561 kubelet[2537]: I0213 15:57:25.495540 2537 reconciler.go:26] "Reconciler: start to sync state"
Feb 13 15:57:25.496941 kubelet[2537]: W0213 15:57:25.496915 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.497011 kubelet[2537]: E0213 15:57:25.497002 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.497100 kubelet[2537]: E0213 15:57:25.497085 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="200ms"
Feb 13 15:57:25.497266 kubelet[2537]: I0213 15:57:25.497257 2537 factory.go:221] Registration of the systemd container factory successfully
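The repeated `connection refused` errors above all point at the same cause: nothing is serving on 139.178.70.106:6443 yet, because the kubelet itself must first start the control-plane static pods from /etc/kubernetes/manifests. When triaging such a burst, it can help to reduce the reflector lines to just the resource kind and endpoint; a minimal sketch (the sample line is copied from the log above):

```python
import re

# Pulls the resource kind and target URL out of a client-go reflector error
# like the ones in the log above.
ERR_RE = re.compile(r'Failed to watch \*v1\.(\w+).*Get "([^"]+)"')

line = ('reflector.go:150] k8s.io/client-go/informers/factory.go:160: '
        'Failed to watch *v1.Node: failed to list *v1.Node: '
        'Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": '
        'dial tcp 139.178.70.106:6443: connect: connection refused')
m = ERR_RE.search(line)
print(m.group(1), m.group(2).split("?")[0])  # kind and endpoint without query string
```

Applied to the whole burst, this shows Node, Service, CSIDriver, and RuntimeClass watches all failing against the same endpoint, which distinguishes "API server not up yet" from a per-resource RBAC or networking problem.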
Feb 13 15:57:25.497358 kubelet[2537]: I0213 15:57:25.497347 2537 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Feb 13 15:57:25.498671 kubelet[2537]: I0213 15:57:25.498660 2537 factory.go:221] Registration of the containerd container factory successfully
Feb 13 15:57:25.501299 kubelet[2537]: E0213 15:57:25.498672 2537 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 13 15:57:25.507424 kubelet[2537]: I0213 15:57:25.507398 2537 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 13 15:57:25.508689 kubelet[2537]: I0213 15:57:25.508675 2537 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 13 15:57:25.508721 kubelet[2537]: I0213 15:57:25.508694 2537 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 13 15:57:25.508721 kubelet[2537]: I0213 15:57:25.508706 2537 kubelet.go:2337] "Starting kubelet main sync loop"
Feb 13 15:57:25.508755 kubelet[2537]: E0213 15:57:25.508729 2537 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 13 15:57:25.512202 kubelet[2537]: W0213 15:57:25.512121 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.512202 kubelet[2537]: E0213 15:57:25.512152 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:25.516181 kubelet[2537]: I0213 15:57:25.516142 2537 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 15:57:25.516181 kubelet[2537]: I0213 15:57:25.516151 2537 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 15:57:25.516181 kubelet[2537]: I0213 15:57:25.516162 2537 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 15:57:25.517091 kubelet[2537]: I0213 15:57:25.517080 2537 policy_none.go:49] "None policy: Start"
Feb 13 15:57:25.517410 kubelet[2537]: I0213 15:57:25.517396 2537 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 13 15:57:25.517459 kubelet[2537]: I0213 15:57:25.517413 2537 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 15:57:25.521540 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Feb 13 15:57:25.529691 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Feb 13 15:57:25.531821 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Feb 13 15:57:25.540271 kubelet[2537]: I0213 15:57:25.539942 2537 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 13 15:57:25.540271 kubelet[2537]: I0213 15:57:25.540078 2537 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 13 15:57:25.541444 kubelet[2537]: E0213 15:57:25.541415 2537 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Feb 13 15:57:25.543185 kubelet[2537]: I0213 15:57:25.543174 2537 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 13 15:57:25.595871 kubelet[2537]: I0213 15:57:25.595856 2537 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 15:57:25.596207 kubelet[2537]: E0213 15:57:25.596196 2537 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost"
Feb 13 15:57:25.609533 kubelet[2537]: I0213 15:57:25.609511 2537 topology_manager.go:215] "Topology Admit Handler" podUID="3b62cf660082c334f4e64456b501db20" podNamespace="kube-system" podName="kube-apiserver-localhost"
Feb 13 15:57:25.610216 kubelet[2537]: I0213 15:57:25.610205 2537 topology_manager.go:215] "Topology Admit Handler" podUID="dd3721fb1a67092819e35b40473f4063" podNamespace="kube-system" podName="kube-controller-manager-localhost"
Feb 13 15:57:25.611776 kubelet[2537]: I0213 15:57:25.611762 2537 topology_manager.go:215] "Topology Admit Handler" podUID="8d610d6c43052dbc8df47eb68906a982" podNamespace="kube-system" podName="kube-scheduler-localhost"
Feb 13 15:57:25.615655 systemd[1]: Created slice kubepods-burstable-pod3b62cf660082c334f4e64456b501db20.slice - libcontainer container kubepods-burstable-pod3b62cf660082c334f4e64456b501db20.slice.
Feb 13 15:57:25.635185 systemd[1]: Created slice kubepods-burstable-poddd3721fb1a67092819e35b40473f4063.slice - libcontainer container kubepods-burstable-poddd3721fb1a67092819e35b40473f4063.slice. Feb 13 15:57:25.637455 systemd[1]: Created slice kubepods-burstable-pod8d610d6c43052dbc8df47eb68906a982.slice - libcontainer container kubepods-burstable-pod8d610d6c43052dbc8df47eb68906a982.slice. Feb 13 15:57:25.697791 kubelet[2537]: I0213 15:57:25.696852 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:25.697791 kubelet[2537]: I0213 15:57:25.697504 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:25.697791 kubelet[2537]: I0213 15:57:25.697520 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:25.697791 kubelet[2537]: I0213 15:57:25.697530 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" 
Feb 13 15:57:25.697791 kubelet[2537]: I0213 15:57:25.697540 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:25.697975 kubelet[2537]: I0213 15:57:25.697560 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost" Feb 13 15:57:25.697975 kubelet[2537]: I0213 15:57:25.697569 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost" Feb 13 15:57:25.697975 kubelet[2537]: I0213 15:57:25.697577 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost" Feb 13 15:57:25.697975 kubelet[2537]: I0213 15:57:25.697586 2537 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8d610d6c43052dbc8df47eb68906a982-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8d610d6c43052dbc8df47eb68906a982\") " pod="kube-system/kube-scheduler-localhost" Feb 13 
15:57:25.697975 kubelet[2537]: E0213 15:57:25.697862 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="400ms" Feb 13 15:57:25.798173 kubelet[2537]: I0213 15:57:25.798154 2537 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Feb 13 15:57:25.798423 kubelet[2537]: E0213 15:57:25.798408 2537 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 15:57:25.934761 containerd[1564]: time="2025-02-13T15:57:25.934735914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:3b62cf660082c334f4e64456b501db20,Namespace:kube-system,Attempt:0,}" Feb 13 15:57:25.944699 containerd[1564]: time="2025-02-13T15:57:25.944682204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:dd3721fb1a67092819e35b40473f4063,Namespace:kube-system,Attempt:0,}" Feb 13 15:57:25.944861 containerd[1564]: time="2025-02-13T15:57:25.944780108Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8d610d6c43052dbc8df47eb68906a982,Namespace:kube-system,Attempt:0,}" Feb 13 15:57:26.098843 kubelet[2537]: E0213 15:57:26.098782 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="800ms" Feb 13 15:57:26.200009 kubelet[2537]: I0213 15:57:26.199988 2537 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Feb 13 15:57:26.200231 kubelet[2537]: E0213 15:57:26.200211 2537 
kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost" Feb 13 15:57:26.469500 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1935368965.mount: Deactivated successfully. Feb 13 15:57:26.473527 containerd[1564]: time="2025-02-13T15:57:26.473489572Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 15:57:26.474263 containerd[1564]: time="2025-02-13T15:57:26.474243856Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 15:57:26.474814 containerd[1564]: time="2025-02-13T15:57:26.474770134Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 15:57:26.475076 containerd[1564]: time="2025-02-13T15:57:26.474947851Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Feb 13 15:57:26.475076 containerd[1564]: time="2025-02-13T15:57:26.475059331Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 15:57:26.475739 containerd[1564]: time="2025-02-13T15:57:26.475724768Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Feb 13 15:57:26.476770 containerd[1564]: time="2025-02-13T15:57:26.476758345Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 15:57:26.478323 
containerd[1564]: time="2025-02-13T15:57:26.478310122Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 543.501844ms" Feb 13 15:57:26.478758 containerd[1564]: time="2025-02-13T15:57:26.478739743Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Feb 13 15:57:26.479536 containerd[1564]: time="2025-02-13T15:57:26.479484956Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 534.757847ms" Feb 13 15:57:26.480660 containerd[1564]: time="2025-02-13T15:57:26.480635608Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 535.819852ms" Feb 13 15:57:26.551389 kubelet[2537]: W0213 15:57:26.551330 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 15:57:26.551389 kubelet[2537]: E0213 15:57:26.551368 2537 reflector.go:150] 
k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 15:57:26.605626 kubelet[2537]: W0213 15:57:26.605580 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 15:57:26.605626 kubelet[2537]: E0213 15:57:26.605605 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused Feb 13 15:57:26.606097 containerd[1564]: time="2025-02-13T15:57:26.604080810Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:26.606184 containerd[1564]: time="2025-02-13T15:57:26.606169484Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:26.606299 containerd[1564]: time="2025-02-13T15:57:26.606230472Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.606495 containerd[1564]: time="2025-02-13T15:57:26.606288903Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.608374 containerd[1564]: time="2025-02-13T15:57:26.608335995Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:26.608423 containerd[1564]: time="2025-02-13T15:57:26.608365776Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:26.608423 containerd[1564]: time="2025-02-13T15:57:26.608375689Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.608423 containerd[1564]: time="2025-02-13T15:57:26.608414758Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.616234 containerd[1564]: time="2025-02-13T15:57:26.616178504Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:26.616234 containerd[1564]: time="2025-02-13T15:57:26.616215403Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:26.616655 containerd[1564]: time="2025-02-13T15:57:26.616225854Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.616655 containerd[1564]: time="2025-02-13T15:57:26.616263851Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:26.632776 systemd[1]: Started cri-containerd-dcaf8d9b5fd40d9690723fde792cf461466e2062fb3b578bebec2cedaffa8a0d.scope - libcontainer container dcaf8d9b5fd40d9690723fde792cf461466e2062fb3b578bebec2cedaffa8a0d. Feb 13 15:57:26.637971 systemd[1]: Started cri-containerd-c2a41d73c967ca77657f4ca7d36fbe5e601240904df2264e5732de691114f0b3.scope - libcontainer container c2a41d73c967ca77657f4ca7d36fbe5e601240904df2264e5732de691114f0b3. 
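The kubelet entries above use Go's klog single-line header layout: a severity letter, MMDD date, wall-clock time, PID, `file:line` of the caller, then the message. A minimal parser sketch for that header (the regex and field names are my own, written to match the lines shown in this log):

```python
import re

# klog header: Lmmdd hh:mm:ss.uuuuuu pid file:line] message
# e.g. 'E0213 15:57:26.605605 2537 reflector.go:150] Failed to watch *v1.RuntimeClass'
KLOG_RE = re.compile(
    r'^(?P<sev>[IWEF])(?P<month>\d{2})(?P<day>\d{2})\s+'
    r'(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid>\d+)\s+'
    r'(?P<src>[\w./-]+:\d+)\]\s'
    r'(?P<msg>.*)$'
)

def parse_klog(line: str):
    """Return a dict of klog header fields for one line, or None if it does not match."""
    m = KLOG_RE.match(line)
    return m.groupdict() if m else None

rec = parse_klog('E0213 15:57:26.605605 2537 reflector.go:150] Failed to watch *v1.RuntimeClass')
print(rec['sev'], rec['src'])  # E reflector.go:150
```

This parses only the klog portion; in journal output it is preceded by the `Feb 13 15:57:26.605626 kubelet[2537]:` journald prefix, which would need to be stripped first.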
Feb 13 15:57:26.641290 systemd[1]: Started cri-containerd-d14b55c9a9b2e5761eaca7df85a889ca469c39ef4b7492c3456fa2c2998d1a93.scope - libcontainer container d14b55c9a9b2e5761eaca7df85a889ca469c39ef4b7492c3456fa2c2998d1a93.
Feb 13 15:57:26.675623 containerd[1564]: time="2025-02-13T15:57:26.675601960Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8d610d6c43052dbc8df47eb68906a982,Namespace:kube-system,Attempt:0,} returns sandbox id \"dcaf8d9b5fd40d9690723fde792cf461466e2062fb3b578bebec2cedaffa8a0d\""
Feb 13 15:57:26.695507 containerd[1564]: time="2025-02-13T15:57:26.695009766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:3b62cf660082c334f4e64456b501db20,Namespace:kube-system,Attempt:0,} returns sandbox id \"d14b55c9a9b2e5761eaca7df85a889ca469c39ef4b7492c3456fa2c2998d1a93\""
Feb 13 15:57:26.698357 containerd[1564]: time="2025-02-13T15:57:26.698344312Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:dd3721fb1a67092819e35b40473f4063,Namespace:kube-system,Attempt:0,} returns sandbox id \"c2a41d73c967ca77657f4ca7d36fbe5e601240904df2264e5732de691114f0b3\""
Feb 13 15:57:26.709154 containerd[1564]: time="2025-02-13T15:57:26.709140551Z" level=info msg="CreateContainer within sandbox \"c2a41d73c967ca77657f4ca7d36fbe5e601240904df2264e5732de691114f0b3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Feb 13 15:57:26.709310 containerd[1564]: time="2025-02-13T15:57:26.709299679Z" level=info msg="CreateContainer within sandbox \"d14b55c9a9b2e5761eaca7df85a889ca469c39ef4b7492c3456fa2c2998d1a93\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Feb 13 15:57:26.709416 containerd[1564]: time="2025-02-13T15:57:26.709407397Z" level=info msg="CreateContainer within sandbox \"dcaf8d9b5fd40d9690723fde792cf461466e2062fb3b578bebec2cedaffa8a0d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Feb 13 15:57:26.806122 kubelet[2537]: W0213 15:57:26.806089 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:26.806122 kubelet[2537]: E0213 15:57:26.806124 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.106:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:26.897890 kubelet[2537]: W0213 15:57:26.897853 2537 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:26.898024 kubelet[2537]: E0213 15:57:26.898009 2537 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.106:6443: connect: connection refused
Feb 13 15:57:26.901384 kubelet[2537]: E0213 15:57:26.901357 2537 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.106:6443: connect: connection refused" interval="1.6s"
Feb 13 15:57:26.906527 containerd[1564]: time="2025-02-13T15:57:26.906474107Z" level=info msg="CreateContainer within sandbox \"d14b55c9a9b2e5761eaca7df85a889ca469c39ef4b7492c3456fa2c2998d1a93\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"87b50c98059c515a506755a6a86937ea1a41d2dd6ae1d96f6a29a0c0703cc4b2\""
Feb 13 15:57:26.907050 containerd[1564]: time="2025-02-13T15:57:26.907012208Z" level=info msg="StartContainer for \"87b50c98059c515a506755a6a86937ea1a41d2dd6ae1d96f6a29a0c0703cc4b2\""
Feb 13 15:57:26.908249 containerd[1564]: time="2025-02-13T15:57:26.908231569Z" level=info msg="CreateContainer within sandbox \"c2a41d73c967ca77657f4ca7d36fbe5e601240904df2264e5732de691114f0b3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f6b2aa3df43dc7b587459f9de81b5907fba007fc808215ff10d45bd39603724d\""
Feb 13 15:57:26.908986 containerd[1564]: time="2025-02-13T15:57:26.908837442Z" level=info msg="StartContainer for \"f6b2aa3df43dc7b587459f9de81b5907fba007fc808215ff10d45bd39603724d\""
Feb 13 15:57:26.909922 containerd[1564]: time="2025-02-13T15:57:26.909906929Z" level=info msg="CreateContainer within sandbox \"dcaf8d9b5fd40d9690723fde792cf461466e2062fb3b578bebec2cedaffa8a0d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"236f5fa7cac6e63654696ed76964188ccc50125485cd0424b56c4c23665a6ac9\""
Feb 13 15:57:26.910364 containerd[1564]: time="2025-02-13T15:57:26.910243581Z" level=info msg="StartContainer for \"236f5fa7cac6e63654696ed76964188ccc50125485cd0424b56c4c23665a6ac9\""
Feb 13 15:57:26.933838 systemd[1]: Started cri-containerd-87b50c98059c515a506755a6a86937ea1a41d2dd6ae1d96f6a29a0c0703cc4b2.scope - libcontainer container 87b50c98059c515a506755a6a86937ea1a41d2dd6ae1d96f6a29a0c0703cc4b2.
Feb 13 15:57:26.940707 systemd[1]: Started cri-containerd-236f5fa7cac6e63654696ed76964188ccc50125485cd0424b56c4c23665a6ac9.scope - libcontainer container 236f5fa7cac6e63654696ed76964188ccc50125485cd0424b56c4c23665a6ac9.
Feb 13 15:57:26.941945 systemd[1]: Started cri-containerd-f6b2aa3df43dc7b587459f9de81b5907fba007fc808215ff10d45bd39603724d.scope - libcontainer container f6b2aa3df43dc7b587459f9de81b5907fba007fc808215ff10d45bd39603724d.
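The lease controller's retry interval grows across the failures logged earlier (interval="400ms", then "800ms", then "1.6s"), consistent with doubling backoff. A minimal sketch of that doubling, assuming a 400 ms base and an arbitrary cap (the kubelet's actual cap is not visible in this log):

```python
def backoff_intervals(base_ms: int = 400, factor: int = 2, cap_ms: int = 7000, steps: int = 5):
    """Yield successive retry intervals in milliseconds, doubling until the cap."""
    interval = base_ms
    for _ in range(steps):
        yield interval
        interval = min(interval * factor, cap_ms)

print(list(backoff_intervals()))  # [400, 800, 1600, 3200, 6400]
```

The first three values match the intervals reported by controller.go:145 above.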
Feb 13 15:57:26.980978 containerd[1564]: time="2025-02-13T15:57:26.980544924Z" level=info msg="StartContainer for \"236f5fa7cac6e63654696ed76964188ccc50125485cd0424b56c4c23665a6ac9\" returns successfully"
Feb 13 15:57:26.985374 containerd[1564]: time="2025-02-13T15:57:26.985355184Z" level=info msg="StartContainer for \"87b50c98059c515a506755a6a86937ea1a41d2dd6ae1d96f6a29a0c0703cc4b2\" returns successfully"
Feb 13 15:57:26.997453 containerd[1564]: time="2025-02-13T15:57:26.997430520Z" level=info msg="StartContainer for \"f6b2aa3df43dc7b587459f9de81b5907fba007fc808215ff10d45bd39603724d\" returns successfully"
Feb 13 15:57:27.002389 kubelet[2537]: I0213 15:57:27.002375 2537 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 15:57:27.003110 kubelet[2537]: E0213 15:57:27.003093 2537 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.106:6443/api/v1/nodes\": dial tcp 139.178.70.106:6443: connect: connection refused" node="localhost"
Feb 13 15:57:28.502913 kubelet[2537]: E0213 15:57:28.502887 2537 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
Feb 13 15:57:28.604274 kubelet[2537]: I0213 15:57:28.604220 2537 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 15:57:28.613659 kubelet[2537]: I0213 15:57:28.613573 2537 kubelet_node_status.go:76] "Successfully registered node" node="localhost"
Feb 13 15:57:28.621292 kubelet[2537]: E0213 15:57:28.621258 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:28.721608 kubelet[2537]: E0213 15:57:28.721582 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:28.822222 kubelet[2537]: E0213 15:57:28.822006 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:28.922949 kubelet[2537]: E0213 15:57:28.922920 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:29.023936 kubelet[2537]: E0213 15:57:29.023902 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:29.124534 kubelet[2537]: E0213 15:57:29.124428 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:29.225029 kubelet[2537]: E0213 15:57:29.224993 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:29.325843 kubelet[2537]: E0213 15:57:29.325814 2537 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 15:57:29.444327 kubelet[2537]: I0213 15:57:29.444253 2537 apiserver.go:52] "Watching apiserver"
Feb 13 15:57:29.495532 kubelet[2537]: I0213 15:57:29.495506 2537 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
Feb 13 15:57:30.146084 systemd[1]: Reload requested from client PID 2809 ('systemctl') (unit session-9.scope)...
Feb 13 15:57:30.146099 systemd[1]: Reloading...
Feb 13 15:57:30.218607 zram_generator::config[2857]: No configuration found.
Feb 13 15:57:30.279191 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 15:57:30.296690 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 15:57:30.369165 systemd[1]: Reloading finished in 222 ms.
Feb 13 15:57:30.384068 kubelet[2537]: E0213 15:57:30.383967 2537 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{localhost.1823cfb239afddd3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 15:57:25.474098643 +0000 UTC m=+0.346582228,LastTimestamp:2025-02-13 15:57:25.474098643 +0000 UTC m=+0.346582228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Feb 13 15:57:30.384098 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:30.398171 systemd[1]: kubelet.service: Deactivated successfully.
Feb 13 15:57:30.398355 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:30.398390 systemd[1]: kubelet.service: Consumed 429ms CPU time, 108.9M memory peak.
Feb 13 15:57:30.401735 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 15:57:30.858075 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 15:57:30.863036 (kubelet)[2920]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 15:57:30.948868 kubelet[2920]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 15:57:30.950182 kubelet[2920]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
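The restarted kubelet warns that flags such as --container-runtime-endpoint should move into the config file named by --config. A hypothetical KubeletConfiguration fragment illustrating that migration (the file path and endpoint values here are illustrative, not read from this host):

```yaml
# /etc/kubernetes/kubelet.conf (illustrative path)
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces the deprecated --container-runtime-endpoint flag
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
# replaces the deprecated --volume-plugin-dir flag
volumePluginDir: /var/lib/kubelet/volumeplugins
```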
Feb 13 15:57:30.950182 kubelet[2920]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 15:57:30.950182 kubelet[2920]: I0213 15:57:30.949142 2920 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 15:57:30.952094 kubelet[2920]: I0213 15:57:30.952083 2920 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Feb 13 15:57:30.952155 kubelet[2920]: I0213 15:57:30.952142 2920 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 15:57:30.952296 kubelet[2920]: I0213 15:57:30.952288 2920 server.go:927] "Client rotation is on, will bootstrap in background"
Feb 13 15:57:30.953046 kubelet[2920]: I0213 15:57:30.953038 2920 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Feb 13 15:57:30.953977 kubelet[2920]: I0213 15:57:30.953845 2920 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 15:57:30.962697 kubelet[2920]: I0213 15:57:30.962680 2920 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 15:57:30.962808 kubelet[2920]: I0213 15:57:30.962783 2920 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 15:57:30.962902 kubelet[2920]: I0213 15:57:30.962809 2920 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Feb 13 15:57:30.962959 kubelet[2920]: I0213 15:57:30.962908 2920 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 15:57:30.962959 kubelet[2920]: I0213 15:57:30.962915 2920 container_manager_linux.go:301] "Creating device plugin manager"
Feb 13 15:57:30.962959 kubelet[2920]: I0213 15:57:30.962937 2920 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 15:57:30.963503 kubelet[2920]: I0213 15:57:30.963489 2920 kubelet.go:400] "Attempting to sync node with API server"
Feb 13 15:57:30.963503 kubelet[2920]: I0213 15:57:30.963504 2920 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 15:57:30.963573 kubelet[2920]: I0213 15:57:30.963518 2920 kubelet.go:312] "Adding apiserver pod source"
Feb 13 15:57:30.963573 kubelet[2920]: I0213 15:57:30.963525 2920 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 15:57:30.971327 kubelet[2920]: I0213 15:57:30.970403 2920 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 15:57:30.971327 kubelet[2920]: I0213 15:57:30.970510 2920 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 15:57:30.971746 kubelet[2920]: I0213 15:57:30.971735 2920 server.go:1264] "Started kubelet"
Feb 13 15:57:30.973413 kubelet[2920]: I0213 15:57:30.972632 2920 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 15:57:30.976523 kubelet[2920]: I0213 15:57:30.976501 2920 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 15:57:30.976981 kubelet[2920]: I0213 15:57:30.976969 2920 volume_manager.go:291] "Starting Kubelet Volume Manager"
Feb 13 15:57:30.977253 kubelet[2920]: I0213 15:57:30.977245 2920 server.go:455] "Adding debug handlers to kubelet server"
Feb 13 15:57:30.978333 kubelet[2920]: I0213 15:57:30.978325 2920 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Feb 13 15:57:30.978641 kubelet[2920]: I0213 15:57:30.978635 2920 reconciler.go:26] "Reconciler: start to sync state"
Feb 13 15:57:30.979506 kubelet[2920]: I0213 15:57:30.978969 2920 factory.go:221] Registration of the systemd container factory successfully
Feb 13 15:57:30.979506 kubelet[2920]: I0213 15:57:30.979017 2920 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Feb 13 15:57:30.979780 kubelet[2920]: I0213 15:57:30.978495 2920 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 15:57:30.980142 kubelet[2920]: I0213 15:57:30.979944 2920 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 15:57:30.980217 kubelet[2920]: I0213 15:57:30.980207 2920 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 13 15:57:30.981563 kubelet[2920]: I0213 15:57:30.981523 2920 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 13 15:57:30.981908 kubelet[2920]: I0213 15:57:30.981616 2920 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 13 15:57:30.981908 kubelet[2920]: I0213 15:57:30.981629 2920 kubelet.go:2337] "Starting kubelet main sync loop"
Feb 13 15:57:30.981908 kubelet[2920]: E0213 15:57:30.981650 2920 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 13 15:57:30.983033 kubelet[2920]: I0213 15:57:30.983024 2920 factory.go:221] Registration of the containerd container factory successfully
Feb 13 15:57:31.020736 kubelet[2920]: I0213 15:57:31.020717 2920 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 15:57:31.020881 kubelet[2920]: I0213 15:57:31.020874 2920 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 15:57:31.020995 kubelet[2920]: I0213 15:57:31.020915 2920 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 15:57:31.021060 kubelet[2920]: I0213 15:57:31.021052 2920 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Feb 13 15:57:31.021111 kubelet[2920]: I0213 15:57:31.021090 2920 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Feb 13 15:57:31.021142 kubelet[2920]: I0213 15:57:31.021139 2920 policy_none.go:49] "None policy: Start"
Feb 13 15:57:31.021565 kubelet[2920]: I0213 15:57:31.021522 2920 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 13 15:57:31.021565 kubelet[2920]: I0213 15:57:31.021534 2920 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 15:57:31.021784 kubelet[2920]: I0213 15:57:31.021721 2920 state_mem.go:75] "Updated machine memory state"
Feb 13 15:57:31.024680 kubelet[2920]: I0213 15:57:31.024671 2920 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 13 15:57:31.024928 kubelet[2920]: I0213 15:57:31.024912 2920 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 13 15:57:31.025706 kubelet[2920]: I0213 15:57:31.025694 2920 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 13 15:57:31.078201 kubelet[2920]: I0213 15:57:31.078178 2920 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 15:57:31.081860 kubelet[2920]: I0213 15:57:31.081786 2920 kubelet_node_status.go:112] "Node was previously registered" node="localhost"
Feb 13 15:57:31.081860 kubelet[2920]: I0213 15:57:31.081835 2920 kubelet_node_status.go:76] "Successfully registered node" node="localhost"
Feb 13 15:57:31.082409 kubelet[2920]: I0213 15:57:31.081795 2920 topology_manager.go:215] "Topology Admit Handler" podUID="3b62cf660082c334f4e64456b501db20" podNamespace="kube-system" podName="kube-apiserver-localhost"
Feb 13 15:57:31.084216 kubelet[2920]: I0213 15:57:31.083493 2920 topology_manager.go:215] "Topology Admit Handler" podUID="dd3721fb1a67092819e35b40473f4063" podNamespace="kube-system" podName="kube-controller-manager-localhost"
Feb 13 15:57:31.084588 kubelet[2920]: I0213 15:57:31.084526 2920 topology_manager.go:215] "Topology Admit Handler" podUID="8d610d6c43052dbc8df47eb68906a982" podNamespace="kube-system" podName="kube-scheduler-localhost"
Feb 13 15:57:31.181765 kubelet[2920]: I0213 15:57:31.181675 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 15:57:31.181765 kubelet[2920]: I0213 15:57:31.181701 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 15:57:31.182508 kubelet[2920]: I0213 15:57:31.182440 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3b62cf660082c334f4e64456b501db20-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"3b62cf660082c334f4e64456b501db20\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 15:57:31.182508 kubelet[2920]: I0213 15:57:31.182460 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 15:57:31.182508 kubelet[2920]: I0213 15:57:31.182471 2920 reconciler_common.go:247]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:31.182508 kubelet[2920]: I0213 15:57:31.182482 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8d610d6c43052dbc8df47eb68906a982-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8d610d6c43052dbc8df47eb68906a982\") " pod="kube-system/kube-scheduler-localhost" Feb 13 15:57:31.182508 kubelet[2920]: I0213 15:57:31.182490 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:31.182650 kubelet[2920]: I0213 15:57:31.182498 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:31.182650 kubelet[2920]: I0213 15:57:31.182508 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 15:57:31.966673 kubelet[2920]: I0213 15:57:31.966646 2920 
apiserver.go:52] "Watching apiserver" Feb 13 15:57:31.979130 kubelet[2920]: I0213 15:57:31.979108 2920 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 15:57:32.066491 kubelet[2920]: I0213 15:57:32.066392 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.066380065 podStartE2EDuration="1.066380065s" podCreationTimestamp="2025-02-13 15:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:57:32.041068643 +0000 UTC m=+1.125768619" watchObservedRunningTime="2025-02-13 15:57:32.066380065 +0000 UTC m=+1.151080048" Feb 13 15:57:32.097144 kubelet[2920]: I0213 15:57:32.096802 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.096793055 podStartE2EDuration="1.096793055s" podCreationTimestamp="2025-02-13 15:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:57:32.067479438 +0000 UTC m=+1.152179420" watchObservedRunningTime="2025-02-13 15:57:32.096793055 +0000 UTC m=+1.181493029" Feb 13 15:57:32.097144 kubelet[2920]: I0213 15:57:32.096865 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.0968617840000001 podStartE2EDuration="1.096861784s" podCreationTimestamp="2025-02-13 15:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:57:32.089719131 +0000 UTC m=+1.174419114" watchObservedRunningTime="2025-02-13 15:57:32.096861784 +0000 UTC m=+1.181561766" Feb 13 15:57:34.590792 systemd[1]: Started sshd@12-139.178.70.106:22-51.178.43.161:57360.service - OpenSSH 
per-connection server daemon (51.178.43.161:57360). Feb 13 15:57:34.869167 sudo[1894]: pam_unix(sudo:session): session closed for user root Feb 13 15:57:34.869834 sshd[1893]: Connection closed by 147.75.109.163 port 47840 Feb 13 15:57:34.870493 sshd-session[1890]: pam_unix(sshd:session): session closed for user core Feb 13 15:57:34.872621 systemd[1]: sshd@9-139.178.70.106:22-147.75.109.163:47840.service: Deactivated successfully. Feb 13 15:57:34.873852 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 15:57:34.874167 systemd[1]: session-9.scope: Consumed 3.023s CPU time, 192.8M memory peak. Feb 13 15:57:34.875449 systemd-logind[1544]: Session 9 logged out. Waiting for processes to exit. Feb 13 15:57:34.876177 systemd-logind[1544]: Removed session 9. Feb 13 15:57:35.436207 sshd[2987]: Invalid user wilma from 51.178.43.161 port 57360 Feb 13 15:57:35.602204 sshd[2987]: Received disconnect from 51.178.43.161 port 57360:11: Bye Bye [preauth] Feb 13 15:57:35.602204 sshd[2987]: Disconnected from invalid user wilma 51.178.43.161 port 57360 [preauth] Feb 13 15:57:35.603598 systemd[1]: sshd@12-139.178.70.106:22-51.178.43.161:57360.service: Deactivated successfully. Feb 13 15:57:45.613457 kubelet[2920]: I0213 15:57:45.613412 2920 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 15:57:45.617820 containerd[1564]: time="2025-02-13T15:57:45.617793287Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Feb 13 15:57:45.618429 kubelet[2920]: I0213 15:57:45.617916 2920 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 15:57:46.482922 kubelet[2920]: I0213 15:57:46.482889 2920 topology_manager.go:215] "Topology Admit Handler" podUID="eb1badf3-32a7-4b82-9cac-c3284113c2c4" podNamespace="kube-system" podName="kube-proxy-gpp54" Feb 13 15:57:46.496054 systemd[1]: Created slice kubepods-besteffort-podeb1badf3_32a7_4b82_9cac_c3284113c2c4.slice - libcontainer container kubepods-besteffort-podeb1badf3_32a7_4b82_9cac_c3284113c2c4.slice. Feb 13 15:57:46.579249 kubelet[2920]: I0213 15:57:46.579224 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/eb1badf3-32a7-4b82-9cac-c3284113c2c4-kube-proxy\") pod \"kube-proxy-gpp54\" (UID: \"eb1badf3-32a7-4b82-9cac-c3284113c2c4\") " pod="kube-system/kube-proxy-gpp54" Feb 13 15:57:46.579459 kubelet[2920]: I0213 15:57:46.579381 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/eb1badf3-32a7-4b82-9cac-c3284113c2c4-xtables-lock\") pod \"kube-proxy-gpp54\" (UID: \"eb1badf3-32a7-4b82-9cac-c3284113c2c4\") " pod="kube-system/kube-proxy-gpp54" Feb 13 15:57:46.579459 kubelet[2920]: I0213 15:57:46.579401 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb1badf3-32a7-4b82-9cac-c3284113c2c4-lib-modules\") pod \"kube-proxy-gpp54\" (UID: \"eb1badf3-32a7-4b82-9cac-c3284113c2c4\") " pod="kube-system/kube-proxy-gpp54" Feb 13 15:57:46.579459 kubelet[2920]: I0213 15:57:46.579418 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6hc\" (UniqueName: \"kubernetes.io/projected/eb1badf3-32a7-4b82-9cac-c3284113c2c4-kube-api-access-rk6hc\") pod 
\"kube-proxy-gpp54\" (UID: \"eb1badf3-32a7-4b82-9cac-c3284113c2c4\") " pod="kube-system/kube-proxy-gpp54" Feb 13 15:57:46.595910 kubelet[2920]: I0213 15:57:46.595877 2920 topology_manager.go:215] "Topology Admit Handler" podUID="7be387ad-2fca-4058-9980-ed67596432a7" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-kd8jd" Feb 13 15:57:46.603245 systemd[1]: Created slice kubepods-besteffort-pod7be387ad_2fca_4058_9980_ed67596432a7.slice - libcontainer container kubepods-besteffort-pod7be387ad_2fca_4058_9980_ed67596432a7.slice. Feb 13 15:57:46.679932 kubelet[2920]: I0213 15:57:46.679900 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7be387ad-2fca-4058-9980-ed67596432a7-var-lib-calico\") pod \"tigera-operator-7bc55997bb-kd8jd\" (UID: \"7be387ad-2fca-4058-9980-ed67596432a7\") " pod="tigera-operator/tigera-operator-7bc55997bb-kd8jd" Feb 13 15:57:46.681084 kubelet[2920]: I0213 15:57:46.679941 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trvdr\" (UniqueName: \"kubernetes.io/projected/7be387ad-2fca-4058-9980-ed67596432a7-kube-api-access-trvdr\") pod \"tigera-operator-7bc55997bb-kd8jd\" (UID: \"7be387ad-2fca-4058-9980-ed67596432a7\") " pod="tigera-operator/tigera-operator-7bc55997bb-kd8jd" Feb 13 15:57:46.805976 containerd[1564]: time="2025-02-13T15:57:46.805946226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gpp54,Uid:eb1badf3-32a7-4b82-9cac-c3284113c2c4,Namespace:kube-system,Attempt:0,}" Feb 13 15:57:46.819856 containerd[1564]: time="2025-02-13T15:57:46.819804447Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:46.819967 containerd[1564]: time="2025-02-13T15:57:46.819833984Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:46.819967 containerd[1564]: time="2025-02-13T15:57:46.819858644Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:46.820026 containerd[1564]: time="2025-02-13T15:57:46.819902524Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:46.838770 systemd[1]: Started cri-containerd-806fb11fd8d340bf367b880f3526d584224e2a261ba171968d9c8740ce7b73fe.scope - libcontainer container 806fb11fd8d340bf367b880f3526d584224e2a261ba171968d9c8740ce7b73fe. Feb 13 15:57:46.852633 containerd[1564]: time="2025-02-13T15:57:46.852471762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gpp54,Uid:eb1badf3-32a7-4b82-9cac-c3284113c2c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"806fb11fd8d340bf367b880f3526d584224e2a261ba171968d9c8740ce7b73fe\"" Feb 13 15:57:46.858075 containerd[1564]: time="2025-02-13T15:57:46.858055955Z" level=info msg="CreateContainer within sandbox \"806fb11fd8d340bf367b880f3526d584224e2a261ba171968d9c8740ce7b73fe\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 15:57:46.902332 containerd[1564]: time="2025-02-13T15:57:46.902269295Z" level=info msg="CreateContainer within sandbox \"806fb11fd8d340bf367b880f3526d584224e2a261ba171968d9c8740ce7b73fe\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"278ec1162ab1383df4605b2282127a3fd84de5c9b3b1dd4c5a2bbe98d38c3b84\"" Feb 13 15:57:46.902931 containerd[1564]: time="2025-02-13T15:57:46.902825615Z" level=info msg="StartContainer for \"278ec1162ab1383df4605b2282127a3fd84de5c9b3b1dd4c5a2bbe98d38c3b84\"" Feb 13 15:57:46.907458 containerd[1564]: time="2025-02-13T15:57:46.907421663Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:tigera-operator-7bc55997bb-kd8jd,Uid:7be387ad-2fca-4058-9980-ed67596432a7,Namespace:tigera-operator,Attempt:0,}" Feb 13 15:57:46.926679 systemd[1]: Started cri-containerd-278ec1162ab1383df4605b2282127a3fd84de5c9b3b1dd4c5a2bbe98d38c3b84.scope - libcontainer container 278ec1162ab1383df4605b2282127a3fd84de5c9b3b1dd4c5a2bbe98d38c3b84. Feb 13 15:57:46.930023 containerd[1564]: time="2025-02-13T15:57:46.929964474Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:46.930023 containerd[1564]: time="2025-02-13T15:57:46.929998118Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:46.930023 containerd[1564]: time="2025-02-13T15:57:46.930004994Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:46.930267 containerd[1564]: time="2025-02-13T15:57:46.930191443Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:46.946669 systemd[1]: Started cri-containerd-3e0e78569b5a7e7ad7e49c117da0416238ebff9868a5c719ed28dafd75546f76.scope - libcontainer container 3e0e78569b5a7e7ad7e49c117da0416238ebff9868a5c719ed28dafd75546f76. 
Feb 13 15:57:46.960517 containerd[1564]: time="2025-02-13T15:57:46.960494573Z" level=info msg="StartContainer for \"278ec1162ab1383df4605b2282127a3fd84de5c9b3b1dd4c5a2bbe98d38c3b84\" returns successfully" Feb 13 15:57:46.979698 containerd[1564]: time="2025-02-13T15:57:46.979670435Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-kd8jd,Uid:7be387ad-2fca-4058-9980-ed67596432a7,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3e0e78569b5a7e7ad7e49c117da0416238ebff9868a5c719ed28dafd75546f76\"" Feb 13 15:57:46.981407 containerd[1564]: time="2025-02-13T15:57:46.981390427Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Feb 13 15:57:48.441895 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2388023677.mount: Deactivated successfully. Feb 13 15:57:48.899574 containerd[1564]: time="2025-02-13T15:57:48.899520826Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:48.900264 containerd[1564]: time="2025-02-13T15:57:48.900232964Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Feb 13 15:57:48.900594 containerd[1564]: time="2025-02-13T15:57:48.900579257Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:48.902619 containerd[1564]: time="2025-02-13T15:57:48.902092869Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:48.905135 containerd[1564]: time="2025-02-13T15:57:48.904315227Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag 
\"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.922901445s" Feb 13 15:57:48.905135 containerd[1564]: time="2025-02-13T15:57:48.904332375Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Feb 13 15:57:48.915442 containerd[1564]: time="2025-02-13T15:57:48.915424577Z" level=info msg="CreateContainer within sandbox \"3e0e78569b5a7e7ad7e49c117da0416238ebff9868a5c719ed28dafd75546f76\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 15:57:48.920349 containerd[1564]: time="2025-02-13T15:57:48.920023394Z" level=info msg="CreateContainer within sandbox \"3e0e78569b5a7e7ad7e49c117da0416238ebff9868a5c719ed28dafd75546f76\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"0b2cf263c8a64d34873d506badc8e76e1c0b221df961bb7dee41897153fbe273\"" Feb 13 15:57:48.921577 containerd[1564]: time="2025-02-13T15:57:48.921355857Z" level=info msg="StartContainer for \"0b2cf263c8a64d34873d506badc8e76e1c0b221df961bb7dee41897153fbe273\"" Feb 13 15:57:48.945634 systemd[1]: Started cri-containerd-0b2cf263c8a64d34873d506badc8e76e1c0b221df961bb7dee41897153fbe273.scope - libcontainer container 0b2cf263c8a64d34873d506badc8e76e1c0b221df961bb7dee41897153fbe273. 
Feb 13 15:57:48.960400 containerd[1564]: time="2025-02-13T15:57:48.960374201Z" level=info msg="StartContainer for \"0b2cf263c8a64d34873d506badc8e76e1c0b221df961bb7dee41897153fbe273\" returns successfully" Feb 13 15:57:49.047609 kubelet[2920]: I0213 15:57:49.045387 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gpp54" podStartSLOduration=3.043976687 podStartE2EDuration="3.043976687s" podCreationTimestamp="2025-02-13 15:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:57:47.036844729 +0000 UTC m=+16.121544711" watchObservedRunningTime="2025-02-13 15:57:49.043976687 +0000 UTC m=+18.128676662" Feb 13 15:57:49.047891 kubelet[2920]: I0213 15:57:49.047656 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-kd8jd" podStartSLOduration=1.123170647 podStartE2EDuration="3.047640887s" podCreationTimestamp="2025-02-13 15:57:46 +0000 UTC" firstStartedPulling="2025-02-13 15:57:46.980279333 +0000 UTC m=+16.064979308" lastFinishedPulling="2025-02-13 15:57:48.904749575 +0000 UTC m=+17.989449548" observedRunningTime="2025-02-13 15:57:49.043408901 +0000 UTC m=+18.128108883" watchObservedRunningTime="2025-02-13 15:57:49.047640887 +0000 UTC m=+18.132340870" Feb 13 15:57:51.763487 kubelet[2920]: I0213 15:57:51.763458 2920 topology_manager.go:215] "Topology Admit Handler" podUID="be8f398f-5d24-4965-97d0-de1342209263" podNamespace="calico-system" podName="calico-typha-54dd46576b-zpzpr" Feb 13 15:57:51.768812 kubelet[2920]: W0213 15:57:51.768658 2920 reflector.go:547] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 
15:57:51.768812 kubelet[2920]: E0213 15:57:51.768742 2920 reflector.go:150] object-"calico-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 15:57:51.770641 kubelet[2920]: W0213 15:57:51.769579 2920 reflector.go:547] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 15:57:51.770641 kubelet[2920]: E0213 15:57:51.769598 2920 reflector.go:150] object-"calico-system"/"tigera-ca-bundle": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 15:57:51.773760 systemd[1]: Created slice kubepods-besteffort-podbe8f398f_5d24_4965_97d0_de1342209263.slice - libcontainer container kubepods-besteffort-podbe8f398f_5d24_4965_97d0_de1342209263.slice. Feb 13 15:57:51.822825 kubelet[2920]: I0213 15:57:51.821972 2920 topology_manager.go:215] "Topology Admit Handler" podUID="5217f836-48d0-4993-bcf9-2ed7a837a6e6" podNamespace="calico-system" podName="calico-node-pwzpk" Feb 13 15:57:51.828438 systemd[1]: Created slice kubepods-besteffort-pod5217f836_48d0_4993_bcf9_2ed7a837a6e6.slice - libcontainer container kubepods-besteffort-pod5217f836_48d0_4993_bcf9_2ed7a837a6e6.slice. 
Feb 13 15:57:51.926195 kubelet[2920]: I0213 15:57:51.926173 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvj2z\" (UniqueName: \"kubernetes.io/projected/be8f398f-5d24-4965-97d0-de1342209263-kube-api-access-vvj2z\") pod \"calico-typha-54dd46576b-zpzpr\" (UID: \"be8f398f-5d24-4965-97d0-de1342209263\") " pod="calico-system/calico-typha-54dd46576b-zpzpr" Feb 13 15:57:51.926315 kubelet[2920]: I0213 15:57:51.926308 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-lib-modules\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926384 kubelet[2920]: I0213 15:57:51.926375 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/be8f398f-5d24-4965-97d0-de1342209263-tigera-ca-bundle\") pod \"calico-typha-54dd46576b-zpzpr\" (UID: \"be8f398f-5d24-4965-97d0-de1342209263\") " pod="calico-system/calico-typha-54dd46576b-zpzpr" Feb 13 15:57:51.926442 kubelet[2920]: I0213 15:57:51.926436 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/be8f398f-5d24-4965-97d0-de1342209263-typha-certs\") pod \"calico-typha-54dd46576b-zpzpr\" (UID: \"be8f398f-5d24-4965-97d0-de1342209263\") " pod="calico-system/calico-typha-54dd46576b-zpzpr" Feb 13 15:57:51.926530 kubelet[2920]: I0213 15:57:51.926522 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-policysync\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " 
pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926676 kubelet[2920]: I0213 15:57:51.926668 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5217f836-48d0-4993-bcf9-2ed7a837a6e6-tigera-ca-bundle\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926735 kubelet[2920]: I0213 15:57:51.926725 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-cni-bin-dir\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926832 kubelet[2920]: I0213 15:57:51.926818 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-xtables-lock\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926935 kubelet[2920]: I0213 15:57:51.926928 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-var-run-calico\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.926994 kubelet[2920]: I0213 15:57:51.926986 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-cni-net-dir\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.927131 kubelet[2920]: 
I0213 15:57:51.927032 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqsz\" (UniqueName: \"kubernetes.io/projected/5217f836-48d0-4993-bcf9-2ed7a837a6e6-kube-api-access-4jqsz\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.927131 kubelet[2920]: I0213 15:57:51.927064 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/5217f836-48d0-4993-bcf9-2ed7a837a6e6-node-certs\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.927131 kubelet[2920]: I0213 15:57:51.927076 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-cni-log-dir\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.927131 kubelet[2920]: I0213 15:57:51.927086 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-var-lib-calico\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.927131 kubelet[2920]: I0213 15:57:51.927096 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/5217f836-48d0-4993-bcf9-2ed7a837a6e6-flexvol-driver-host\") pod \"calico-node-pwzpk\" (UID: \"5217f836-48d0-4993-bcf9-2ed7a837a6e6\") " pod="calico-system/calico-node-pwzpk" Feb 13 15:57:51.929978 kubelet[2920]: I0213 15:57:51.929086 2920 topology_manager.go:215] 
"Topology Admit Handler" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" podNamespace="calico-system" podName="csi-node-driver-fz6cz" Feb 13 15:57:51.929978 kubelet[2920]: E0213 15:57:51.929276 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:57:52.027325 kubelet[2920]: I0213 15:57:52.027254 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7wc6\" (UniqueName: \"kubernetes.io/projected/b126212a-e016-4060-9fc3-97a9a5142c06-kube-api-access-d7wc6\") pod \"csi-node-driver-fz6cz\" (UID: \"b126212a-e016-4060-9fc3-97a9a5142c06\") " pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:57:52.027325 kubelet[2920]: I0213 15:57:52.027289 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b126212a-e016-4060-9fc3-97a9a5142c06-varrun\") pod \"csi-node-driver-fz6cz\" (UID: \"b126212a-e016-4060-9fc3-97a9a5142c06\") " pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:57:52.027325 kubelet[2920]: I0213 15:57:52.027310 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b126212a-e016-4060-9fc3-97a9a5142c06-registration-dir\") pod \"csi-node-driver-fz6cz\" (UID: \"b126212a-e016-4060-9fc3-97a9a5142c06\") " pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:57:52.027436 kubelet[2920]: I0213 15:57:52.027330 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b126212a-e016-4060-9fc3-97a9a5142c06-socket-dir\") pod \"csi-node-driver-fz6cz\" 
(UID: \"b126212a-e016-4060-9fc3-97a9a5142c06\") " pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:57:52.027436 kubelet[2920]: I0213 15:57:52.027365 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b126212a-e016-4060-9fc3-97a9a5142c06-kubelet-dir\") pod \"csi-node-driver-fz6cz\" (UID: \"b126212a-e016-4060-9fc3-97a9a5142c06\") " pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:57:52.032490 kubelet[2920]: E0213 15:57:52.032348 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.032490 kubelet[2920]: W0213 15:57:52.032366 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.038179 kubelet[2920]: E0213 15:57:52.036527 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Feb 13 15:57:52.538367 kubelet[2920]: E0213 15:57:52.538282 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.538367 kubelet[2920]: W0213 15:57:52.538294 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.538367 kubelet[2920]: E0213 15:57:52.538305 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.538513 kubelet[2920]: E0213 15:57:52.538506 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.538579 kubelet[2920]: W0213 15:57:52.538546 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.538673 kubelet[2920]: E0213 15:57:52.538615 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.538782 kubelet[2920]: E0213 15:57:52.538725 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.538782 kubelet[2920]: W0213 15:57:52.538731 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.538782 kubelet[2920]: E0213 15:57:52.538737 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.538861 kubelet[2920]: E0213 15:57:52.538855 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.538896 kubelet[2920]: W0213 15:57:52.538890 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.538926 kubelet[2920]: E0213 15:57:52.538921 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.539059 kubelet[2920]: E0213 15:57:52.539053 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.539115 kubelet[2920]: W0213 15:57:52.539092 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.539115 kubelet[2920]: E0213 15:57:52.539099 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.640173 kubelet[2920]: E0213 15:57:52.640045 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.640173 kubelet[2920]: W0213 15:57:52.640062 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.640173 kubelet[2920]: E0213 15:57:52.640073 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.640458 kubelet[2920]: E0213 15:57:52.640411 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.640458 kubelet[2920]: W0213 15:57:52.640418 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.640458 kubelet[2920]: E0213 15:57:52.640425 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.640874 kubelet[2920]: E0213 15:57:52.640811 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.640874 kubelet[2920]: W0213 15:57:52.640823 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.640874 kubelet[2920]: E0213 15:57:52.640829 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.640966 kubelet[2920]: E0213 15:57:52.640960 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.641055 kubelet[2920]: W0213 15:57:52.640994 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.641055 kubelet[2920]: E0213 15:57:52.641002 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.641126 kubelet[2920]: E0213 15:57:52.641120 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.641162 kubelet[2920]: W0213 15:57:52.641156 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.641196 kubelet[2920]: E0213 15:57:52.641189 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.741773 kubelet[2920]: E0213 15:57:52.741754 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.741773 kubelet[2920]: W0213 15:57:52.741768 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.741773 kubelet[2920]: E0213 15:57:52.741780 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.741902 kubelet[2920]: E0213 15:57:52.741890 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.741902 kubelet[2920]: W0213 15:57:52.741897 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.741943 kubelet[2920]: E0213 15:57:52.741903 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.742013 kubelet[2920]: E0213 15:57:52.741997 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.742013 kubelet[2920]: W0213 15:57:52.742005 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.742013 kubelet[2920]: E0213 15:57:52.742010 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.742115 kubelet[2920]: E0213 15:57:52.742105 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.742115 kubelet[2920]: W0213 15:57:52.742113 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.742154 kubelet[2920]: E0213 15:57:52.742118 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.742213 kubelet[2920]: E0213 15:57:52.742202 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.742213 kubelet[2920]: W0213 15:57:52.742210 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.742247 kubelet[2920]: E0213 15:57:52.742215 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.843048 kubelet[2920]: E0213 15:57:52.842852 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.843048 kubelet[2920]: W0213 15:57:52.842865 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.843048 kubelet[2920]: E0213 15:57:52.842877 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.843048 kubelet[2920]: E0213 15:57:52.842980 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.843048 kubelet[2920]: W0213 15:57:52.842985 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.843048 kubelet[2920]: E0213 15:57:52.842994 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.843365 kubelet[2920]: E0213 15:57:52.843246 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.843365 kubelet[2920]: W0213 15:57:52.843251 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.843365 kubelet[2920]: E0213 15:57:52.843256 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.843365 kubelet[2920]: E0213 15:57:52.843346 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.843365 kubelet[2920]: W0213 15:57:52.843351 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.843365 kubelet[2920]: E0213 15:57:52.843357 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.843456 kubelet[2920]: E0213 15:57:52.843442 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.843456 kubelet[2920]: W0213 15:57:52.843446 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.843456 kubelet[2920]: E0213 15:57:52.843454 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.944283 kubelet[2920]: E0213 15:57:52.944143 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.944283 kubelet[2920]: W0213 15:57:52.944154 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.944283 kubelet[2920]: E0213 15:57:52.944166 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.944965 kubelet[2920]: E0213 15:57:52.944646 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.944965 kubelet[2920]: W0213 15:57:52.944653 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.944965 kubelet[2920]: E0213 15:57:52.944660 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.944965 kubelet[2920]: E0213 15:57:52.944866 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.944965 kubelet[2920]: W0213 15:57:52.944871 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.945108 kubelet[2920]: E0213 15:57:52.944972 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.945425 kubelet[2920]: E0213 15:57:52.945415 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.945425 kubelet[2920]: W0213 15:57:52.945422 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.945475 kubelet[2920]: E0213 15:57:52.945428 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.945522 kubelet[2920]: E0213 15:57:52.945513 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.945522 kubelet[2920]: W0213 15:57:52.945520 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.945594 kubelet[2920]: E0213 15:57:52.945525 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.983345 kubelet[2920]: E0213 15:57:52.982620 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.983345 kubelet[2920]: W0213 15:57:52.982633 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.983345 kubelet[2920]: E0213 15:57:52.982644 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:52.983345 kubelet[2920]: E0213 15:57:52.982892 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.983345 kubelet[2920]: W0213 15:57:52.982896 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.983345 kubelet[2920]: E0213 15:57:52.982903 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:52.988570 kubelet[2920]: E0213 15:57:52.987992 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:52.988570 kubelet[2920]: W0213 15:57:52.988004 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:52.988570 kubelet[2920]: E0213 15:57:52.988015 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.028570 kubelet[2920]: E0213 15:57:53.028522 2920 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 13 15:57:53.028675 kubelet[2920]: E0213 15:57:53.028603 2920 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5217f836-48d0-4993-bcf9-2ed7a837a6e6-tigera-ca-bundle podName:5217f836-48d0-4993-bcf9-2ed7a837a6e6 nodeName:}" failed. No retries permitted until 2025-02-13 15:57:53.528587813 +0000 UTC m=+22.613287788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/5217f836-48d0-4993-bcf9-2ed7a837a6e6-tigera-ca-bundle") pod "calico-node-pwzpk" (UID: "5217f836-48d0-4993-bcf9-2ed7a837a6e6") : failed to sync configmap cache: timed out waiting for the condition Feb 13 15:57:53.028773 kubelet[2920]: E0213 15:57:53.028522 2920 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 13 15:57:53.028773 kubelet[2920]: E0213 15:57:53.028758 2920 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/be8f398f-5d24-4965-97d0-de1342209263-tigera-ca-bundle podName:be8f398f-5d24-4965-97d0-de1342209263 nodeName:}" failed. No retries permitted until 2025-02-13 15:57:53.528749515 +0000 UTC m=+22.613449491 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/be8f398f-5d24-4965-97d0-de1342209263-tigera-ca-bundle") pod "calico-typha-54dd46576b-zpzpr" (UID: "be8f398f-5d24-4965-97d0-de1342209263") : failed to sync configmap cache: timed out waiting for the condition Feb 13 15:57:53.045886 kubelet[2920]: E0213 15:57:53.045866 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.045886 kubelet[2920]: W0213 15:57:53.045878 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.045886 kubelet[2920]: E0213 15:57:53.045892 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.046322 kubelet[2920]: E0213 15:57:53.045985 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.046322 kubelet[2920]: W0213 15:57:53.045990 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.046322 kubelet[2920]: E0213 15:57:53.045996 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.146426 kubelet[2920]: E0213 15:57:53.146406 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.146426 kubelet[2920]: W0213 15:57:53.146422 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.146573 kubelet[2920]: E0213 15:57:53.146436 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.146573 kubelet[2920]: E0213 15:57:53.146571 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.146626 kubelet[2920]: W0213 15:57:53.146576 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.146626 kubelet[2920]: E0213 15:57:53.146583 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.247302 kubelet[2920]: E0213 15:57:53.247283 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.247302 kubelet[2920]: W0213 15:57:53.247299 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.247436 kubelet[2920]: E0213 15:57:53.247311 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.247436 kubelet[2920]: E0213 15:57:53.247427 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.247436 kubelet[2920]: W0213 15:57:53.247432 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.247507 kubelet[2920]: E0213 15:57:53.247438 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.347799 kubelet[2920]: E0213 15:57:53.347773 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.347799 kubelet[2920]: W0213 15:57:53.347788 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.347799 kubelet[2920]: E0213 15:57:53.347802 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.348019 kubelet[2920]: E0213 15:57:53.347965 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.348019 kubelet[2920]: W0213 15:57:53.347972 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.348019 kubelet[2920]: E0213 15:57:53.347978 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.448831 kubelet[2920]: E0213 15:57:53.448807 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.448831 kubelet[2920]: W0213 15:57:53.448826 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.448831 kubelet[2920]: E0213 15:57:53.448842 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.449039 kubelet[2920]: E0213 15:57:53.448980 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.449039 kubelet[2920]: W0213 15:57:53.448986 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.449039 kubelet[2920]: E0213 15:57:53.448993 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.550314 kubelet[2920]: E0213 15:57:53.550147 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.550314 kubelet[2920]: W0213 15:57:53.550194 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.550314 kubelet[2920]: E0213 15:57:53.550210 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.550656 kubelet[2920]: E0213 15:57:53.550503 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.550656 kubelet[2920]: W0213 15:57:53.550512 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.550656 kubelet[2920]: E0213 15:57:53.550523 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.550819 kubelet[2920]: E0213 15:57:53.550692 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.550819 kubelet[2920]: W0213 15:57:53.550701 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.550819 kubelet[2920]: E0213 15:57:53.550723 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.551228 kubelet[2920]: E0213 15:57:53.551135 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.551228 kubelet[2920]: W0213 15:57:53.551144 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.551228 kubelet[2920]: E0213 15:57:53.551155 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.551424 kubelet[2920]: E0213 15:57:53.551295 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.551424 kubelet[2920]: W0213 15:57:53.551301 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.551424 kubelet[2920]: E0213 15:57:53.551312 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.551619 kubelet[2920]: E0213 15:57:53.551465 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.551619 kubelet[2920]: W0213 15:57:53.551473 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.551619 kubelet[2920]: E0213 15:57:53.551486 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.551894 kubelet[2920]: E0213 15:57:53.551811 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.551894 kubelet[2920]: W0213 15:57:53.551819 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.551894 kubelet[2920]: E0213 15:57:53.551831 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.552117 kubelet[2920]: E0213 15:57:53.551974 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.552117 kubelet[2920]: W0213 15:57:53.551981 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.552117 kubelet[2920]: E0213 15:57:53.551992 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.552358 kubelet[2920]: E0213 15:57:53.552221 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.552358 kubelet[2920]: W0213 15:57:53.552230 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.552358 kubelet[2920]: E0213 15:57:53.552241 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.552358 kubelet[2920]: E0213 15:57:53.552354 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.552460 kubelet[2920]: W0213 15:57:53.552360 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.552460 kubelet[2920]: E0213 15:57:53.552368 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.553035 kubelet[2920]: E0213 15:57:53.552969 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.553035 kubelet[2920]: W0213 15:57:53.552978 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.553035 kubelet[2920]: E0213 15:57:53.552986 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:53.553211 kubelet[2920]: E0213 15:57:53.553110 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:53.553211 kubelet[2920]: W0213 15:57:53.553126 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:53.553211 kubelet[2920]: E0213 15:57:53.553135 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:53.577877 containerd[1564]: time="2025-02-13T15:57:53.577848995Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54dd46576b-zpzpr,Uid:be8f398f-5d24-4965-97d0-de1342209263,Namespace:calico-system,Attempt:0,}" Feb 13 15:57:53.596951 containerd[1564]: time="2025-02-13T15:57:53.596898278Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:53.596951 containerd[1564]: time="2025-02-13T15:57:53.596929381Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:53.597239 containerd[1564]: time="2025-02-13T15:57:53.596940825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:53.597239 containerd[1564]: time="2025-02-13T15:57:53.596981976Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:53.608663 systemd[1]: Started cri-containerd-de89e8fe8517477de7db0c8a63b120f175ece4d779d2f6981b37a7cd0fe594b8.scope - libcontainer container de89e8fe8517477de7db0c8a63b120f175ece4d779d2f6981b37a7cd0fe594b8. Feb 13 15:57:53.633810 containerd[1564]: time="2025-02-13T15:57:53.633740440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pwzpk,Uid:5217f836-48d0-4993-bcf9-2ed7a837a6e6,Namespace:calico-system,Attempt:0,}" Feb 13 15:57:53.642660 containerd[1564]: time="2025-02-13T15:57:53.642603534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54dd46576b-zpzpr,Uid:be8f398f-5d24-4965-97d0-de1342209263,Namespace:calico-system,Attempt:0,} returns sandbox id \"de89e8fe8517477de7db0c8a63b120f175ece4d779d2f6981b37a7cd0fe594b8\"" Feb 13 15:57:53.643677 containerd[1564]: time="2025-02-13T15:57:53.643660103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 15:57:53.854332 containerd[1564]: time="2025-02-13T15:57:53.854157288Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:57:53.854332 containerd[1564]: time="2025-02-13T15:57:53.854222061Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:57:53.854332 containerd[1564]: time="2025-02-13T15:57:53.854237564Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:53.855322 containerd[1564]: time="2025-02-13T15:57:53.854600410Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:57:53.870657 systemd[1]: Started cri-containerd-541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5.scope - libcontainer container 541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5. Feb 13 15:57:53.886405 containerd[1564]: time="2025-02-13T15:57:53.886378956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-pwzpk,Uid:5217f836-48d0-4993-bcf9-2ed7a837a6e6,Namespace:calico-system,Attempt:0,} returns sandbox id \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\"" Feb 13 15:57:53.982595 kubelet[2920]: E0213 15:57:53.982461 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:57:54.584334 systemd[1]: Started sshd@13-139.178.70.106:22-185.213.165.55:37198.service - OpenSSH per-connection server daemon (185.213.165.55:37198). Feb 13 15:57:55.314817 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1223800525.mount: Deactivated successfully. Feb 13 15:57:55.347020 systemd[1]: Started sshd@14-139.178.70.106:22-151.145.39.181:59854.service - OpenSSH per-connection server daemon (151.145.39.181:59854). 
Feb 13 15:57:55.778460 containerd[1564]: time="2025-02-13T15:57:55.778414381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:55.779164 containerd[1564]: time="2025-02-13T15:57:55.779065329Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Feb 13 15:57:55.779385 containerd[1564]: time="2025-02-13T15:57:55.779369575Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:55.780471 containerd[1564]: time="2025-02-13T15:57:55.780455403Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:55.781088 containerd[1564]: time="2025-02-13T15:57:55.781072462Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.137394846s" Feb 13 15:57:55.781115 containerd[1564]: time="2025-02-13T15:57:55.781089515Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Feb 13 15:57:55.782168 containerd[1564]: time="2025-02-13T15:57:55.781774405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 15:57:55.790314 containerd[1564]: time="2025-02-13T15:57:55.790000099Z" level=info msg="CreateContainer within sandbox \"de89e8fe8517477de7db0c8a63b120f175ece4d779d2f6981b37a7cd0fe594b8\" for 
container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 15:57:55.800163 containerd[1564]: time="2025-02-13T15:57:55.800139328Z" level=info msg="CreateContainer within sandbox \"de89e8fe8517477de7db0c8a63b120f175ece4d779d2f6981b37a7cd0fe594b8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9d31f0749e1092bcbd22f33eb20b7ab020d9290aaceb53213250824ee3316f44\"" Feb 13 15:57:55.800713 containerd[1564]: time="2025-02-13T15:57:55.800473904Z" level=info msg="StartContainer for \"9d31f0749e1092bcbd22f33eb20b7ab020d9290aaceb53213250824ee3316f44\"" Feb 13 15:57:55.848657 systemd[1]: Started cri-containerd-9d31f0749e1092bcbd22f33eb20b7ab020d9290aaceb53213250824ee3316f44.scope - libcontainer container 9d31f0749e1092bcbd22f33eb20b7ab020d9290aaceb53213250824ee3316f44. Feb 13 15:57:55.881965 containerd[1564]: time="2025-02-13T15:57:55.881936694Z" level=info msg="StartContainer for \"9d31f0749e1092bcbd22f33eb20b7ab020d9290aaceb53213250824ee3316f44\" returns successfully" Feb 13 15:57:55.942742 sshd[3500]: Invalid user jabber from 151.145.39.181 port 59854 Feb 13 15:57:55.981836 kubelet[2920]: E0213 15:57:55.981804 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:57:55.985993 sshd[3493]: Invalid user ffff from 185.213.165.55 port 37198 Feb 13 15:57:56.039609 sshd[3500]: Received disconnect from 151.145.39.181 port 59854:11: Bye Bye [preauth] Feb 13 15:57:56.039609 sshd[3500]: Disconnected from invalid user jabber 151.145.39.181 port 59854 [preauth] Feb 13 15:57:56.040239 systemd[1]: sshd@14-139.178.70.106:22-151.145.39.181:59854.service: Deactivated successfully. 
Feb 13 15:57:56.054810 kubelet[2920]: E0213 15:57:56.054761 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.054810 kubelet[2920]: W0213 15:57:56.054779 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.055043 kubelet[2920]: E0213 15:57:56.054792 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.055165 kubelet[2920]: E0213 15:57:56.055147 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.055287 kubelet[2920]: W0213 15:57:56.055208 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.055287 kubelet[2920]: E0213 15:57:56.055223 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.055558 kubelet[2920]: E0213 15:57:56.055484 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.055558 kubelet[2920]: W0213 15:57:56.055504 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.055558 kubelet[2920]: E0213 15:57:56.055514 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.055830 kubelet[2920]: E0213 15:57:56.055755 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.055830 kubelet[2920]: W0213 15:57:56.055764 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.055830 kubelet[2920]: E0213 15:57:56.055771 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.056110 kubelet[2920]: E0213 15:57:56.055961 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.056110 kubelet[2920]: W0213 15:57:56.056056 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.056110 kubelet[2920]: E0213 15:57:56.056065 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.056367 kubelet[2920]: E0213 15:57:56.056273 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.056367 kubelet[2920]: W0213 15:57:56.056312 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.056367 kubelet[2920]: E0213 15:57:56.056320 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.056630 kubelet[2920]: E0213 15:57:56.056544 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.056630 kubelet[2920]: W0213 15:57:56.056571 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.056630 kubelet[2920]: E0213 15:57:56.056579 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.056869 kubelet[2920]: E0213 15:57:56.056775 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.056869 kubelet[2920]: W0213 15:57:56.056782 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.056869 kubelet[2920]: E0213 15:57:56.056789 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.057094 kubelet[2920]: E0213 15:57:56.057032 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.057094 kubelet[2920]: W0213 15:57:56.057042 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.057094 kubelet[2920]: E0213 15:57:56.057050 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.057301 kubelet[2920]: E0213 15:57:56.057242 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.057301 kubelet[2920]: W0213 15:57:56.057250 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.057301 kubelet[2920]: E0213 15:57:56.057257 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.057534 kubelet[2920]: E0213 15:57:56.057472 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.057534 kubelet[2920]: W0213 15:57:56.057481 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.057534 kubelet[2920]: E0213 15:57:56.057488 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.057786 kubelet[2920]: E0213 15:57:56.057722 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.057786 kubelet[2920]: W0213 15:57:56.057730 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.057786 kubelet[2920]: E0213 15:57:56.057737 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.058037 kubelet[2920]: E0213 15:57:56.057939 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.058037 kubelet[2920]: W0213 15:57:56.057947 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.058037 kubelet[2920]: E0213 15:57:56.057956 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.058258 kubelet[2920]: E0213 15:57:56.058155 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.058258 kubelet[2920]: W0213 15:57:56.058163 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.058258 kubelet[2920]: E0213 15:57:56.058170 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.058453 kubelet[2920]: E0213 15:57:56.058388 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.058453 kubelet[2920]: W0213 15:57:56.058397 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.058453 kubelet[2920]: E0213 15:57:56.058404 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.064510 kubelet[2920]: I0213 15:57:56.064471 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54dd46576b-zpzpr" podStartSLOduration=2.9263194329999997 podStartE2EDuration="5.064455139s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:57:53.643388938 +0000 UTC m=+22.728088912" lastFinishedPulling="2025-02-13 15:57:55.781508584 +0000 UTC m=+24.866224618" observedRunningTime="2025-02-13 15:57:56.063866304 +0000 UTC m=+25.148566296" watchObservedRunningTime="2025-02-13 15:57:56.064455139 +0000 UTC m=+25.149155120" Feb 13 15:57:56.070782 kubelet[2920]: E0213 15:57:56.070680 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.070782 kubelet[2920]: W0213 15:57:56.070697 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.070782 kubelet[2920]: E0213 15:57:56.070711 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.071164 kubelet[2920]: E0213 15:57:56.071047 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.071164 kubelet[2920]: W0213 15:57:56.071056 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.071164 kubelet[2920]: E0213 15:57:56.071080 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071238 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.071804 kubelet[2920]: W0213 15:57:56.071244 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071252 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071574 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.071804 kubelet[2920]: W0213 15:57:56.071584 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071597 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071716 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.071804 kubelet[2920]: W0213 15:57:56.071725 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.071804 kubelet[2920]: E0213 15:57:56.071737 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.072163 kubelet[2920]: E0213 15:57:56.071848 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072163 kubelet[2920]: W0213 15:57:56.071854 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072163 kubelet[2920]: E0213 15:57:56.071862 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.072163 kubelet[2920]: E0213 15:57:56.071963 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072163 kubelet[2920]: W0213 15:57:56.071968 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072163 kubelet[2920]: E0213 15:57:56.071977 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.072380 kubelet[2920]: E0213 15:57:56.072324 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072380 kubelet[2920]: W0213 15:57:56.072332 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072380 kubelet[2920]: E0213 15:57:56.072344 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.072599 kubelet[2920]: E0213 15:57:56.072437 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072599 kubelet[2920]: W0213 15:57:56.072442 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072599 kubelet[2920]: E0213 15:57:56.072450 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.072599 kubelet[2920]: E0213 15:57:56.072597 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072867 kubelet[2920]: W0213 15:57:56.072603 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072867 kubelet[2920]: E0213 15:57:56.072612 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.072867 kubelet[2920]: E0213 15:57:56.072718 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.072867 kubelet[2920]: W0213 15:57:56.072723 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.072867 kubelet[2920]: E0213 15:57:56.072731 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.073144 kubelet[2920]: E0213 15:57:56.073019 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073144 kubelet[2920]: W0213 15:57:56.073027 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.073144 kubelet[2920]: E0213 15:57:56.073041 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.073224 kubelet[2920]: E0213 15:57:56.073205 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073224 kubelet[2920]: W0213 15:57:56.073211 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.073224 kubelet[2920]: E0213 15:57:56.073218 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.073358 kubelet[2920]: E0213 15:57:56.073315 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073358 kubelet[2920]: W0213 15:57:56.073320 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.073358 kubelet[2920]: E0213 15:57:56.073326 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.073521 kubelet[2920]: E0213 15:57:56.073433 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073521 kubelet[2920]: W0213 15:57:56.073438 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.073521 kubelet[2920]: E0213 15:57:56.073444 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.073757 kubelet[2920]: E0213 15:57:56.073666 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073757 kubelet[2920]: W0213 15:57:56.073673 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.073757 kubelet[2920]: E0213 15:57:56.073683 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:56.073873 kubelet[2920]: E0213 15:57:56.073866 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.073928 kubelet[2920]: W0213 15:57:56.073910 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.074000 kubelet[2920]: E0213 15:57:56.073971 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.074092 kubelet[2920]: E0213 15:57:56.074081 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:56.074092 kubelet[2920]: W0213 15:57:56.074090 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:56.074149 kubelet[2920]: E0213 15:57:56.074097 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:56.248249 sshd[3493]: Received disconnect from 185.213.165.55 port 37198:11: Bye Bye [preauth] Feb 13 15:57:56.248249 sshd[3493]: Disconnected from invalid user ffff 185.213.165.55 port 37198 [preauth] Feb 13 15:57:56.248944 systemd[1]: sshd@13-139.178.70.106:22-185.213.165.55:37198.service: Deactivated successfully. 
Feb 13 15:57:57.049799 kubelet[2920]: I0213 15:57:57.049772 2920 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 15:57:57.066627 kubelet[2920]: E0213 15:57:57.066606 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.067071 kubelet[2920]: W0213 15:57:57.066924 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.067071 kubelet[2920]: E0213 15:57:57.066950 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.067256 kubelet[2920]: E0213 15:57:57.067184 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.067256 kubelet[2920]: W0213 15:57:57.067195 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.067256 kubelet[2920]: E0213 15:57:57.067202 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.067505 kubelet[2920]: E0213 15:57:57.067400 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.067505 kubelet[2920]: W0213 15:57:57.067409 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.067505 kubelet[2920]: E0213 15:57:57.067417 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.067730 kubelet[2920]: E0213 15:57:57.067659 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.067730 kubelet[2920]: W0213 15:57:57.067669 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.067730 kubelet[2920]: E0213 15:57:57.067675 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.067911 kubelet[2920]: E0213 15:57:57.067820 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.067911 kubelet[2920]: W0213 15:57:57.067833 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.067911 kubelet[2920]: E0213 15:57:57.067841 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.068205 kubelet[2920]: E0213 15:57:57.068107 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.068205 kubelet[2920]: W0213 15:57:57.068114 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.068205 kubelet[2920]: E0213 15:57:57.068120 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.068454 kubelet[2920]: E0213 15:57:57.068385 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.068454 kubelet[2920]: W0213 15:57:57.068393 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.068454 kubelet[2920]: E0213 15:57:57.068401 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.068996 kubelet[2920]: E0213 15:57:57.068521 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.068996 kubelet[2920]: W0213 15:57:57.068527 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.068996 kubelet[2920]: E0213 15:57:57.068535 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.068996 kubelet[2920]: E0213 15:57:57.068794 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.068996 kubelet[2920]: W0213 15:57:57.068833 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.068996 kubelet[2920]: E0213 15:57:57.068843 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.069223 kubelet[2920]: E0213 15:57:57.069157 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.069223 kubelet[2920]: W0213 15:57:57.069165 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.069223 kubelet[2920]: E0213 15:57:57.069174 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.069396 kubelet[2920]: E0213 15:57:57.069336 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.069396 kubelet[2920]: W0213 15:57:57.069344 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.069396 kubelet[2920]: E0213 15:57:57.069351 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.069499 kubelet[2920]: E0213 15:57:57.069492 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.069533 kubelet[2920]: W0213 15:57:57.069528 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.069626 kubelet[2920]: E0213 15:57:57.069579 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.069735 kubelet[2920]: E0213 15:57:57.069681 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.069735 kubelet[2920]: W0213 15:57:57.069687 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.069735 kubelet[2920]: E0213 15:57:57.069692 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.069823 kubelet[2920]: E0213 15:57:57.069818 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.069852 kubelet[2920]: W0213 15:57:57.069848 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.069888 kubelet[2920]: E0213 15:57:57.069882 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.070013 kubelet[2920]: E0213 15:57:57.070007 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.070079 kubelet[2920]: W0213 15:57:57.070042 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.070079 kubelet[2920]: E0213 15:57:57.070050 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.078348 kubelet[2920]: E0213 15:57:57.078326 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.078348 kubelet[2920]: W0213 15:57:57.078341 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.078348 kubelet[2920]: E0213 15:57:57.078354 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.078603 kubelet[2920]: E0213 15:57:57.078479 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.078603 kubelet[2920]: W0213 15:57:57.078485 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.078603 kubelet[2920]: E0213 15:57:57.078492 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.078603 kubelet[2920]: E0213 15:57:57.078598 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.078603 kubelet[2920]: W0213 15:57:57.078603 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.078862 kubelet[2920]: E0213 15:57:57.078610 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.079013 kubelet[2920]: E0213 15:57:57.078930 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079013 kubelet[2920]: W0213 15:57:57.078939 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079013 kubelet[2920]: E0213 15:57:57.078950 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.079243 kubelet[2920]: E0213 15:57:57.079170 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079243 kubelet[2920]: W0213 15:57:57.079181 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079243 kubelet[2920]: E0213 15:57:57.079199 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.079459 kubelet[2920]: E0213 15:57:57.079376 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079459 kubelet[2920]: W0213 15:57:57.079382 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079459 kubelet[2920]: E0213 15:57:57.079394 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.079688 kubelet[2920]: E0213 15:57:57.079579 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079688 kubelet[2920]: W0213 15:57:57.079588 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079688 kubelet[2920]: E0213 15:57:57.079600 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.079774 kubelet[2920]: E0213 15:57:57.079745 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079774 kubelet[2920]: W0213 15:57:57.079750 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079774 kubelet[2920]: E0213 15:57:57.079756 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.079853 kubelet[2920]: E0213 15:57:57.079840 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.079853 kubelet[2920]: W0213 15:57:57.079845 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.079853 kubelet[2920]: E0213 15:57:57.079850 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.080005 kubelet[2920]: E0213 15:57:57.079925 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.080005 kubelet[2920]: W0213 15:57:57.079932 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.080005 kubelet[2920]: E0213 15:57:57.079939 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.080189 kubelet[2920]: E0213 15:57:57.080032 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.080189 kubelet[2920]: W0213 15:57:57.080037 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.080189 kubelet[2920]: E0213 15:57:57.080042 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.080382 kubelet[2920]: E0213 15:57:57.080288 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.080382 kubelet[2920]: W0213 15:57:57.080296 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.080382 kubelet[2920]: E0213 15:57:57.080313 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.080579 kubelet[2920]: E0213 15:57:57.080501 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.080579 kubelet[2920]: W0213 15:57:57.080507 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.080579 kubelet[2920]: E0213 15:57:57.080517 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.080811 kubelet[2920]: E0213 15:57:57.080731 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.080811 kubelet[2920]: W0213 15:57:57.080740 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.080811 kubelet[2920]: E0213 15:57:57.080757 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.081086 kubelet[2920]: E0213 15:57:57.080976 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.081086 kubelet[2920]: W0213 15:57:57.080985 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.081086 kubelet[2920]: E0213 15:57:57.080998 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.081160 kubelet[2920]: E0213 15:57:57.081143 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.081160 kubelet[2920]: W0213 15:57:57.081149 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.081160 kubelet[2920]: E0213 15:57:57.081154 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.081314 kubelet[2920]: E0213 15:57:57.081244 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.081314 kubelet[2920]: W0213 15:57:57.081252 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.081314 kubelet[2920]: E0213 15:57:57.081260 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 15:57:57.081449 kubelet[2920]: E0213 15:57:57.081375 2920 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 15:57:57.081449 kubelet[2920]: W0213 15:57:57.081381 2920 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 15:57:57.081449 kubelet[2920]: E0213 15:57:57.081389 2920 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 15:57:57.234930 containerd[1564]: time="2025-02-13T15:57:57.234903874Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:57.235561 containerd[1564]: time="2025-02-13T15:57:57.235522735Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Feb 13 15:57:57.235786 containerd[1564]: time="2025-02-13T15:57:57.235770687Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:57.237537 containerd[1564]: time="2025-02-13T15:57:57.237047782Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:57:57.237537 containerd[1564]: time="2025-02-13T15:57:57.237477575Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.455322454s" Feb 13 15:57:57.237537 containerd[1564]: time="2025-02-13T15:57:57.237493121Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 15:57:57.239245 containerd[1564]: time="2025-02-13T15:57:57.238971900Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 15:57:57.248405 containerd[1564]: time="2025-02-13T15:57:57.248378473Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6\"" Feb 13 15:57:57.249575 containerd[1564]: time="2025-02-13T15:57:57.249463632Z" level=info msg="StartContainer for \"ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6\"" Feb 13 15:57:57.271635 systemd[1]: Started cri-containerd-ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6.scope - libcontainer container ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6. Feb 13 15:57:57.299114 systemd[1]: cri-containerd-ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6.scope: Deactivated successfully. 
Feb 13 15:57:57.300419 containerd[1564]: time="2025-02-13T15:57:57.299815615Z" level=info msg="StartContainer for \"ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6\" returns successfully"
Feb 13 15:57:57.785976 systemd[1]: run-containerd-runc-k8s.io-ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6-runc.ZRk6by.mount: Deactivated successfully.
Feb 13 15:57:57.786053 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6-rootfs.mount: Deactivated successfully.
Feb 13 15:57:57.877638 containerd[1564]: time="2025-02-13T15:57:57.871572808Z" level=info msg="shim disconnected" id=ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6 namespace=k8s.io
Feb 13 15:57:57.877638 containerd[1564]: time="2025-02-13T15:57:57.877571078Z" level=warning msg="cleaning up after shim disconnected" id=ff4aa3e72c0298fcf498ab96a6597a3cb5c56e66c048f46f6fee26a1d61b29e6 namespace=k8s.io
Feb 13 15:57:57.877638 containerd[1564]: time="2025-02-13T15:57:57.877578430Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 15:57:57.982283 kubelet[2920]: E0213 15:57:57.982046 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06"
Feb 13 15:57:58.053268 containerd[1564]: time="2025-02-13T15:57:58.053189738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Feb 13 15:57:59.981952 kubelet[2920]: E0213 15:57:59.981911 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06"
Feb 13 15:58:01.509857 containerd[1564]: time="2025-02-13T15:58:01.509358249Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:01.509857 containerd[1564]: time="2025-02-13T15:58:01.509738424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154"
Feb 13 15:58:01.509857 containerd[1564]: time="2025-02-13T15:58:01.509823346Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:01.511761 containerd[1564]: time="2025-02-13T15:58:01.511742620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:01.512107 containerd[1564]: time="2025-02-13T15:58:01.512094703Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 3.458067734s"
Feb 13 15:58:01.512340 containerd[1564]: time="2025-02-13T15:58:01.512224206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\""
Feb 13 15:58:01.514275 containerd[1564]: time="2025-02-13T15:58:01.514255564Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Feb 13 15:58:01.532755 containerd[1564]: time="2025-02-13T15:58:01.532730497Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684\""
Feb 13 15:58:01.533954 containerd[1564]: time="2025-02-13T15:58:01.533135785Z" level=info msg="StartContainer for \"bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684\""
Feb 13 15:58:01.579642 systemd[1]: Started cri-containerd-bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684.scope - libcontainer container bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684.
Feb 13 15:58:01.597882 containerd[1564]: time="2025-02-13T15:58:01.597835897Z" level=info msg="StartContainer for \"bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684\" returns successfully"
Feb 13 15:58:01.982287 kubelet[2920]: E0213 15:58:01.982183 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06"
Feb 13 15:58:02.898537 systemd[1]: cri-containerd-bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684.scope: Deactivated successfully.
Feb 13 15:58:02.898749 systemd[1]: cri-containerd-bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684.scope: Consumed 257ms CPU time, 147.4M memory peak, 28K read from disk, 151M written to disk.
Feb 13 15:58:02.917839 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684-rootfs.mount: Deactivated successfully.
Feb 13 15:58:02.921780 containerd[1564]: time="2025-02-13T15:58:02.921138176Z" level=info msg="shim disconnected" id=bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684 namespace=k8s.io
Feb 13 15:58:02.921780 containerd[1564]: time="2025-02-13T15:58:02.921172596Z" level=warning msg="cleaning up after shim disconnected" id=bf0310b5159e64815f7afadb3a25141ec6f06bc4723e89b6ddb81b4c7e5c1684 namespace=k8s.io
Feb 13 15:58:02.921780 containerd[1564]: time="2025-02-13T15:58:02.921178333Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Feb 13 15:58:02.981641 kubelet[2920]: I0213 15:58:02.981624 2920 kubelet_node_status.go:497] "Fast updating node status as it just became ready"
Feb 13 15:58:03.018823 kubelet[2920]: I0213 15:58:03.018768 2920 topology_manager.go:215] "Topology Admit Handler" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" podNamespace="calico-system" podName="calico-kube-controllers-5ddbbb799c-mbdnx"
Feb 13 15:58:03.026050 kubelet[2920]: I0213 15:58:03.025992 2920 topology_manager.go:215] "Topology Admit Handler" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" podNamespace="kube-system" podName="coredns-7db6d8ff4d-jnfjq"
Feb 13 15:58:03.026484 kubelet[2920]: I0213 15:58:03.026214 2920 topology_manager.go:215] "Topology Admit Handler" podUID="a494df23-70e9-451c-8266-c2382d1a2d64" podNamespace="kube-system" podName="coredns-7db6d8ff4d-c8kpg"
Feb 13 15:58:03.026484 kubelet[2920]: I0213 15:58:03.026272 2920 topology_manager.go:215] "Topology Admit Handler" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" podNamespace="calico-apiserver" podName="calico-apiserver-8dd87f54d-92htb"
Feb 13 15:58:03.026484 kubelet[2920]: I0213 15:58:03.026326 2920 topology_manager.go:215] "Topology Admit Handler" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" podNamespace="calico-apiserver" podName="calico-apiserver-8dd87f54d-nz7vq"
Feb 13 15:58:03.060115 systemd[1]: Created slice kubepods-besteffort-podd8a4eee4_dc8a_4fd7_97cb_aaa8332159e4.slice - libcontainer container kubepods-besteffort-podd8a4eee4_dc8a_4fd7_97cb_aaa8332159e4.slice.
Feb 13 15:58:03.064626 containerd[1564]: time="2025-02-13T15:58:03.064437847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\""
Feb 13 15:58:03.069042 systemd[1]: Created slice kubepods-besteffort-pod3fabfa82_f1ab_4e56_bc9e_31febf370fec.slice - libcontainer container kubepods-besteffort-pod3fabfa82_f1ab_4e56_bc9e_31febf370fec.slice.
Feb 13 15:58:03.075765 systemd[1]: Created slice kubepods-burstable-poda494df23_70e9_451c_8266_c2382d1a2d64.slice - libcontainer container kubepods-burstable-poda494df23_70e9_451c_8266_c2382d1a2d64.slice.
Feb 13 15:58:03.084785 systemd[1]: Created slice kubepods-burstable-pod440fcad6_af3c_4b75_a27e_d1a967a963e9.slice - libcontainer container kubepods-burstable-pod440fcad6_af3c_4b75_a27e_d1a967a963e9.slice.
Feb 13 15:58:03.091500 systemd[1]: Created slice kubepods-besteffort-pod01cdc21b_7e9c_4a56_99bc_b4069f009602.slice - libcontainer container kubepods-besteffort-pod01cdc21b_7e9c_4a56_99bc_b4069f009602.slice.
Feb 13 15:58:03.117054 kubelet[2920]: I0213 15:58:03.117022 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3fabfa82-f1ab-4e56-bc9e-31febf370fec-tigera-ca-bundle\") pod \"calico-kube-controllers-5ddbbb799c-mbdnx\" (UID: \"3fabfa82-f1ab-4e56-bc9e-31febf370fec\") " pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx"
Feb 13 15:58:03.117229 kubelet[2920]: I0213 15:58:03.117218 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/01cdc21b-7e9c-4a56-99bc-b4069f009602-calico-apiserver-certs\") pod \"calico-apiserver-8dd87f54d-92htb\" (UID: \"01cdc21b-7e9c-4a56-99bc-b4069f009602\") " pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb"
Feb 13 15:58:03.117665 kubelet[2920]: I0213 15:58:03.117282 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a494df23-70e9-451c-8266-c2382d1a2d64-config-volume\") pod \"coredns-7db6d8ff4d-c8kpg\" (UID: \"a494df23-70e9-451c-8266-c2382d1a2d64\") " pod="kube-system/coredns-7db6d8ff4d-c8kpg"
Feb 13 15:58:03.117665 kubelet[2920]: I0213 15:58:03.117316 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6qgc\" (UniqueName: \"kubernetes.io/projected/3fabfa82-f1ab-4e56-bc9e-31febf370fec-kube-api-access-p6qgc\") pod \"calico-kube-controllers-5ddbbb799c-mbdnx\" (UID: \"3fabfa82-f1ab-4e56-bc9e-31febf370fec\") " pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx"
Feb 13 15:58:03.117665 kubelet[2920]: I0213 15:58:03.117331 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frnfg\" (UniqueName: \"kubernetes.io/projected/01cdc21b-7e9c-4a56-99bc-b4069f009602-kube-api-access-frnfg\") pod \"calico-apiserver-8dd87f54d-92htb\" (UID: \"01cdc21b-7e9c-4a56-99bc-b4069f009602\") " pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb"
Feb 13 15:58:03.117665 kubelet[2920]: I0213 15:58:03.117351 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbt2h\" (UniqueName: \"kubernetes.io/projected/440fcad6-af3c-4b75-a27e-d1a967a963e9-kube-api-access-zbt2h\") pod \"coredns-7db6d8ff4d-jnfjq\" (UID: \"440fcad6-af3c-4b75-a27e-d1a967a963e9\") " pod="kube-system/coredns-7db6d8ff4d-jnfjq"
Feb 13 15:58:03.117665 kubelet[2920]: I0213 15:58:03.117369 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj2mp\" (UniqueName: \"kubernetes.io/projected/a494df23-70e9-451c-8266-c2382d1a2d64-kube-api-access-sj2mp\") pod \"coredns-7db6d8ff4d-c8kpg\" (UID: \"a494df23-70e9-451c-8266-c2382d1a2d64\") " pod="kube-system/coredns-7db6d8ff4d-c8kpg"
Feb 13 15:58:03.117790 kubelet[2920]: I0213 15:58:03.117380 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4-calico-apiserver-certs\") pod \"calico-apiserver-8dd87f54d-nz7vq\" (UID: \"d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4\") " pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq"
Feb 13 15:58:03.117836 kubelet[2920]: I0213 15:58:03.117651 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn7kf\" (UniqueName: \"kubernetes.io/projected/d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4-kube-api-access-rn7kf\") pod \"calico-apiserver-8dd87f54d-nz7vq\" (UID: \"d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4\") " pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq"
Feb 13 15:58:03.117899 kubelet[2920]: I0213 15:58:03.117881 2920 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/440fcad6-af3c-4b75-a27e-d1a967a963e9-config-volume\") pod \"coredns-7db6d8ff4d-jnfjq\" (UID: \"440fcad6-af3c-4b75-a27e-d1a967a963e9\") " pod="kube-system/coredns-7db6d8ff4d-jnfjq"
Feb 13 15:58:03.365376 containerd[1564]: time="2025-02-13T15:58:03.365345882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:0,}"
Feb 13 15:58:03.379889 containerd[1564]: time="2025-02-13T15:58:03.379742897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:0,}"
Feb 13 15:58:03.382285 containerd[1564]: time="2025-02-13T15:58:03.382217277Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:0,}"
Feb 13 15:58:03.392243 containerd[1564]: time="2025-02-13T15:58:03.391971927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:0,}"
Feb 13 15:58:03.394067 containerd[1564]: time="2025-02-13T15:58:03.394051334Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:0,}"
Feb 13 15:58:03.675981 containerd[1564]: time="2025-02-13T15:58:03.675897702Z" level=error msg="Failed to destroy network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.678136 containerd[1564]: time="2025-02-13T15:58:03.676973420Z" level=error msg="Failed to destroy network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.679203 containerd[1564]: time="2025-02-13T15:58:03.679160706Z" level=error msg="encountered an error cleaning up failed sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.679268 containerd[1564]: time="2025-02-13T15:58:03.679217938Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.679694 containerd[1564]: time="2025-02-13T15:58:03.679673710Z" level=error msg="encountered an error cleaning up failed sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.680373 containerd[1564]: time="2025-02-13T15:58:03.679711147Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.688358 kubelet[2920]: E0213 15:58:03.682719 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.688492 kubelet[2920]: E0213 15:58:03.682720 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.689778 kubelet[2920]: E0213 15:58:03.689754 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb"
Feb 13 15:58:03.689778 kubelet[2920]: E0213 15:58:03.689776 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb"
Feb 13 15:58:03.689883 kubelet[2920]: E0213 15:58:03.689812 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602"
Feb 13 15:58:03.689937 kubelet[2920]: E0213 15:58:03.689905 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg"
Feb 13 15:58:03.689937 kubelet[2920]: E0213 15:58:03.689917 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg"
Feb 13 15:58:03.689978 kubelet[2920]: E0213 15:58:03.689933 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podUID="a494df23-70e9-451c-8266-c2382d1a2d64"
Feb 13 15:58:03.692643 containerd[1564]: time="2025-02-13T15:58:03.692615165Z" level=error msg="Failed to destroy network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.692880 containerd[1564]: time="2025-02-13T15:58:03.692864606Z" level=error msg="encountered an error cleaning up failed sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.692911 containerd[1564]: time="2025-02-13T15:58:03.692902459Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693060 kubelet[2920]: E0213 15:58:03.693030 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693122 kubelet[2920]: E0213 15:58:03.693113 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq"
Feb 13 15:58:03.693170 kubelet[2920]: E0213 15:58:03.693161 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq"
Feb 13 15:58:03.693232 kubelet[2920]: E0213 15:58:03.693219 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4"
Feb 13 15:58:03.693633 containerd[1564]: time="2025-02-13T15:58:03.693616909Z" level=error msg="Failed to destroy network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693823 containerd[1564]: time="2025-02-13T15:58:03.693807592Z" level=error msg="encountered an error cleaning up failed sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693853 containerd[1564]: time="2025-02-13T15:58:03.693841782Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693991 kubelet[2920]: E0213 15:58:03.693922 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.693991 kubelet[2920]: E0213 15:58:03.693944 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq"
Feb 13 15:58:03.693991 kubelet[2920]: E0213 15:58:03.693954 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq"
Feb 13 15:58:03.694061 kubelet[2920]: E0213 15:58:03.693971 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9"
Feb 13 15:58:03.695453 containerd[1564]: time="2025-02-13T15:58:03.695436159Z" level=error msg="Failed to destroy network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.695752 containerd[1564]: time="2025-02-13T15:58:03.695710764Z" level=error msg="encountered an error cleaning up failed sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.695836 containerd[1564]: time="2025-02-13T15:58:03.695822507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.696049 kubelet[2920]: E0213 15:58:03.696023 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:03.696087 kubelet[2920]: E0213 15:58:03.696056 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx"
Feb 13 15:58:03.696087 kubelet[2920]: E0213 15:58:03.696070 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx"
Feb 13 15:58:03.696147 kubelet[2920]: E0213 15:58:03.696102 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec"
Feb 13 15:58:03.919077 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849-shm.mount: Deactivated successfully.
Feb 13 15:58:03.986067 systemd[1]: Created slice kubepods-besteffort-podb126212a_e016_4060_9fc3_97a9a5142c06.slice - libcontainer container kubepods-besteffort-podb126212a_e016_4060_9fc3_97a9a5142c06.slice.
Feb 13 15:58:03.987954 containerd[1564]: time="2025-02-13T15:58:03.987900796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:0,}"
Feb 13 15:58:04.039575 containerd[1564]: time="2025-02-13T15:58:04.039441764Z" level=error msg="Failed to destroy network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:04.040019 containerd[1564]: time="2025-02-13T15:58:04.039923392Z" level=error msg="encountered an error cleaning up failed sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:04.040019 containerd[1564]: time="2025-02-13T15:58:04.039988518Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:04.040725 kubelet[2920]: E0213 15:58:04.040683 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Feb 13 15:58:04.041006 kubelet[2920]: E0213 15:58:04.040741 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz"
Feb 13 15:58:04.041006 kubelet[2920]: E0213 15:58:04.040755 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz"
Feb 13 15:58:04.041006 kubelet[2920]: E0213 15:58:04.040786 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06"
Feb 13 15:58:04.042038 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a-shm.mount: Deactivated successfully.
Feb 13 15:58:04.065901 kubelet[2920]: I0213 15:58:04.065871 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9"
Feb 13 15:58:04.068031 kubelet[2920]: I0213 15:58:04.067619 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849"
Feb 13 15:58:04.080800 kubelet[2920]: I0213 15:58:04.080403 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a"
Feb 13 15:58:04.081598 kubelet[2920]: I0213 15:58:04.081003 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a"
Feb 13 15:58:04.082564 kubelet[2920]: I0213 15:58:04.082440 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97"
Feb 13 15:58:04.083094 kubelet[2920]: I0213 15:58:04.083078 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6"
Feb 13 15:58:04.111479 containerd[1564]: time="2025-02-13T15:58:04.111441950Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\""
Feb 13 15:58:04.111639 containerd[1564]: time="2025-02-13T15:58:04.111604983Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\""
Feb 13 15:58:04.113799 containerd[1564]: time="2025-02-13T15:58:04.113766174Z" level=info msg="Ensure that sandbox e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6 in task-service has been cleanup successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.113969900Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.113982522Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114052512Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\""
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114134708Z" level=info msg="Ensure that sandbox e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a in task-service has been cleanup successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114254615Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114262177Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully"
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114290448Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\""
Feb 13 15:58:04.114434 containerd[1564]: time="2025-02-13T15:58:04.114368895Z" level=info
msg="Ensure that sandbox 84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a in task-service has been cleanup successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115026365Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115089277Z" level=info msg="Ensure that sandbox 7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9 in task-service has been cleanup successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115116220Z" level=info msg="Ensure that sandbox e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97 in task-service has been cleanup successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115277963Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115375972Z" level=info msg="Ensure that sandbox 375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849 in task-service has been cleanup successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115748841Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115757638Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115782842Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:04.115857 containerd[1564]: time="2025-02-13T15:58:04.115788632Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns 
successfully" Feb 13 15:58:04.116127 containerd[1564]: time="2025-02-13T15:58:04.116095297Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:04.116127 containerd[1564]: time="2025-02-13T15:58:04.116105644Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:04.116225 containerd[1564]: time="2025-02-13T15:58:04.116199870Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:04.116225 containerd[1564]: time="2025-02-13T15:58:04.116208768Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:04.116511 systemd[1]: run-netns-cni\x2df14ece0b\x2dc660\x2d0027\x2dd752\x2d8c57975631ea.mount: Deactivated successfully. Feb 13 15:58:04.116581 systemd[1]: run-netns-cni\x2d0d1e8c6e\x2dc957\x2d2bf3\x2d676d\x2de6e18eeff41c.mount: Deactivated successfully. Feb 13 15:58:04.116623 systemd[1]: run-netns-cni\x2daeeebe9c\x2d9cfe\x2d47b4\x2d6178\x2d7d2ed9ec197e.mount: Deactivated successfully. Feb 13 15:58:04.116660 systemd[1]: run-netns-cni\x2db360666b\x2d60a3\x2d415f\x2d8784\x2df5d0537f8f1f.mount: Deactivated successfully. 
Feb 13 15:58:04.118070 containerd[1564]: time="2025-02-13T15:58:04.117082101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:1,}" Feb 13 15:58:04.118070 containerd[1564]: time="2025-02-13T15:58:04.117615913Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:1,}" Feb 13 15:58:04.118070 containerd[1564]: time="2025-02-13T15:58:04.117761464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:1,}" Feb 13 15:58:04.118070 containerd[1564]: time="2025-02-13T15:58:04.117863188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:1,}" Feb 13 15:58:04.118070 containerd[1564]: time="2025-02-13T15:58:04.118060631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:1,}" Feb 13 15:58:04.118196 containerd[1564]: time="2025-02-13T15:58:04.118171609Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:1,}" Feb 13 15:58:04.119950 systemd[1]: run-netns-cni\x2d68ba3cce\x2dbbdf\x2dcfe6\x2d7302\x2d9eef266055c8.mount: Deactivated successfully. Feb 13 15:58:04.120011 systemd[1]: run-netns-cni\x2dd485b225\x2d05af\x2d995e\x2dd830\x2dad642caac779.mount: Deactivated successfully. 
Feb 13 15:58:04.226910 containerd[1564]: time="2025-02-13T15:58:04.226885008Z" level=error msg="Failed to destroy network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.227291 containerd[1564]: time="2025-02-13T15:58:04.227197649Z" level=error msg="encountered an error cleaning up failed sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.227291 containerd[1564]: time="2025-02-13T15:58:04.227237641Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.227462 kubelet[2920]: E0213 15:58:04.227440 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.227499 kubelet[2920]: E0213 15:58:04.227479 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:04.227748 kubelet[2920]: E0213 15:58:04.227494 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:04.227748 kubelet[2920]: E0213 15:58:04.227527 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:58:04.266244 containerd[1564]: time="2025-02-13T15:58:04.265986614Z" level=error msg="Failed to destroy network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 15:58:04.266244 containerd[1564]: time="2025-02-13T15:58:04.266233552Z" level=error msg="encountered an error cleaning up failed sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.266345 containerd[1564]: time="2025-02-13T15:58:04.266283558Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.266478 kubelet[2920]: E0213 15:58:04.266456 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.266579 kubelet[2920]: E0213 15:58:04.266568 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:04.268876 kubelet[2920]: E0213 
15:58:04.268016 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:04.268876 kubelet[2920]: E0213 15:58:04.268120 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" Feb 13 15:58:04.271660 containerd[1564]: time="2025-02-13T15:58:04.271635151Z" level=error msg="Failed to destroy network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.271976 containerd[1564]: time="2025-02-13T15:58:04.271855891Z" level=error msg="encountered an error cleaning up failed sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.271976 containerd[1564]: time="2025-02-13T15:58:04.271904297Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.272048 kubelet[2920]: E0213 15:58:04.272012 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.272086 kubelet[2920]: E0213 15:58:04.272042 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:04.272086 kubelet[2920]: E0213 15:58:04.272057 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:04.272127 kubelet[2920]: E0213 15:58:04.272087 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" Feb 13 15:58:04.273117 containerd[1564]: time="2025-02-13T15:58:04.272850465Z" level=error msg="Failed to destroy network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273117 containerd[1564]: time="2025-02-13T15:58:04.273053490Z" level=error msg="encountered an error cleaning up failed sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273117 containerd[1564]: time="2025-02-13T15:58:04.273077098Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273233 kubelet[2920]: E0213 15:58:04.273194 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273233 kubelet[2920]: E0213 15:58:04.273216 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:04.273233 kubelet[2920]: E0213 15:58:04.273227 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:04.273323 kubelet[2920]: E0213 15:58:04.273248 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podUID="a494df23-70e9-451c-8266-c2382d1a2d64" Feb 13 15:58:04.273687 containerd[1564]: time="2025-02-13T15:58:04.273648110Z" level=error msg="Failed to destroy network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273932 containerd[1564]: time="2025-02-13T15:58:04.273914575Z" level=error msg="encountered an error cleaning up failed sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.273969 containerd[1564]: time="2025-02-13T15:58:04.273941197Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.274343 kubelet[2920]: E0213 15:58:04.274120 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.274343 kubelet[2920]: E0213 15:58:04.274149 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:04.274343 kubelet[2920]: E0213 15:58:04.274163 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:04.274418 kubelet[2920]: E0213 15:58:04.274181 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" Feb 13 15:58:04.274715 containerd[1564]: time="2025-02-13T15:58:04.274698282Z" level=error msg="Failed to destroy network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.274891 containerd[1564]: time="2025-02-13T15:58:04.274873886Z" level=error msg="encountered an error cleaning up failed sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.274941 containerd[1564]: time="2025-02-13T15:58:04.274924158Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.275121 kubelet[2920]: E0213 15:58:04.275036 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:04.275121 kubelet[2920]: E0213 15:58:04.275058 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:04.275121 kubelet[2920]: E0213 15:58:04.275068 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:04.275425 kubelet[2920]: E0213 15:58:04.275098 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" Feb 13 15:58:05.085293 kubelet[2920]: I0213 15:58:05.085267 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856" Feb 13 15:58:05.086188 containerd[1564]: time="2025-02-13T15:58:05.085961782Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:05.086188 containerd[1564]: time="2025-02-13T15:58:05.086114802Z" level=info msg="Ensure that sandbox 9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856 in task-service has been cleanup successfully" Feb 13 15:58:05.088780 systemd[1]: run-netns-cni\x2d3694a0d9\x2db20e\x2dacfd\x2d553a\x2d1998df8bcb53.mount: Deactivated successfully. Feb 13 15:58:05.089654 containerd[1564]: time="2025-02-13T15:58:05.087071644Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:05.089654 containerd[1564]: time="2025-02-13T15:58:05.089345968Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 15:58:05.090181 kubelet[2920]: I0213 15:58:05.089834 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a" Feb 13 15:58:05.090230 containerd[1564]: time="2025-02-13T15:58:05.090200687Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\"" Feb 13 15:58:05.090279 containerd[1564]: time="2025-02-13T15:58:05.090260792Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully" Feb 13 15:58:05.090279 containerd[1564]: time="2025-02-13T15:58:05.090272694Z" level=info msg="StopPodSandbox for 
\"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:05.091006 containerd[1564]: time="2025-02-13T15:58:05.090955987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:2,}" Feb 13 15:58:05.091505 kubelet[2920]: I0213 15:58:05.091287 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8" Feb 13 15:58:05.091642 containerd[1564]: time="2025-02-13T15:58:05.091622673Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:05.091982 containerd[1564]: time="2025-02-13T15:58:05.091962341Z" level=info msg="Ensure that sandbox cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8 in task-service has been cleanup successfully" Feb 13 15:58:05.092265 containerd[1564]: time="2025-02-13T15:58:05.092246509Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:05.092265 containerd[1564]: time="2025-02-13T15:58:05.092260288Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:05.092598 containerd[1564]: time="2025-02-13T15:58:05.092509873Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:05.092642 containerd[1564]: time="2025-02-13T15:58:05.092621127Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:05.093119 containerd[1564]: time="2025-02-13T15:58:05.092630540Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:05.093119 
containerd[1564]: time="2025-02-13T15:58:05.093014251Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:2,}" Feb 13 15:58:05.093174 kubelet[2920]: I0213 15:58:05.092888 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6" Feb 13 15:58:05.093202 containerd[1564]: time="2025-02-13T15:58:05.093128763Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:05.093523 containerd[1564]: time="2025-02-13T15:58:05.093440873Z" level=info msg="Ensure that sandbox 744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6 in task-service has been cleanup successfully" Feb 13 15:58:05.093893 containerd[1564]: time="2025-02-13T15:58:05.093868223Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:05.093893 containerd[1564]: time="2025-02-13T15:58:05.093882208Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:05.094991 containerd[1564]: time="2025-02-13T15:58:05.094125496Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:05.094991 containerd[1564]: time="2025-02-13T15:58:05.094175934Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:05.094991 containerd[1564]: time="2025-02-13T15:58:05.094183861Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:05.102911 kubelet[2920]: I0213 15:58:05.094721 2920 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512" Feb 13 15:58:05.102911 kubelet[2920]: I0213 15:58:05.098263 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb" Feb 13 15:58:05.097232 systemd[1]: run-netns-cni\x2d362be102\x2dde5a\x2de47c\x2debb0\x2d927c3e1de501.mount: Deactivated successfully. Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.094993026Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.095139843Z" level=info msg="Ensure that sandbox d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512 in task-service has been cleanup successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.095271414Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.095281726Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.095352562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:2,}" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.096226622Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.096296563Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.096309537Z" level=info msg="StopPodSandbox for 
\"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.097809804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:2,}" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.098764225Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.099405734Z" level=info msg="Ensure that sandbox 7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb in task-service has been cleanup successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.099682801Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.099695316Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.099942504Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.100059165Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.100070243Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:05.103024 containerd[1564]: time="2025-02-13T15:58:05.100302453Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:2,}" Feb 13 15:58:05.097304 systemd[1]: run-netns-cni\x2d78cf6e14\x2d69f3\x2d653e\x2df69e\x2dbcc7e9e23ac5.mount: Deactivated successfully. Feb 13 15:58:05.101836 systemd[1]: run-netns-cni\x2db13cceaa\x2d4951\x2d7489\x2d3fb5\x2dd88074b33b54.mount: Deactivated successfully. Feb 13 15:58:05.101902 systemd[1]: run-netns-cni\x2db1427ff1\x2d3966\x2db119\x2d18ed\x2d35aa0fde311f.mount: Deactivated successfully. Feb 13 15:58:05.110749 containerd[1564]: time="2025-02-13T15:58:05.110727382Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:05.110885 containerd[1564]: time="2025-02-13T15:58:05.110869665Z" level=info msg="Ensure that sandbox c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a in task-service has been cleanup successfully" Feb 13 15:58:05.122417 containerd[1564]: time="2025-02-13T15:58:05.111024064Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:05.122417 containerd[1564]: time="2025-02-13T15:58:05.111034025Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:05.122417 containerd[1564]: time="2025-02-13T15:58:05.111250956Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:05.122417 containerd[1564]: time="2025-02-13T15:58:05.111310878Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:05.122417 containerd[1564]: time="2025-02-13T15:58:05.111316878Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:05.122417 
containerd[1564]: time="2025-02-13T15:58:05.111563994Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:2,}" Feb 13 15:58:05.624413 containerd[1564]: time="2025-02-13T15:58:05.624341422Z" level=error msg="Failed to destroy network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.629848 containerd[1564]: time="2025-02-13T15:58:05.629793283Z" level=error msg="encountered an error cleaning up failed sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.629848 containerd[1564]: time="2025-02-13T15:58:05.629841957Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.630019 kubelet[2920]: E0213 15:58:05.629974 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.630019 kubelet[2920]: E0213 15:58:05.630006 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:05.630148 kubelet[2920]: E0213 15:58:05.630021 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:05.630148 kubelet[2920]: E0213 15:58:05.630057 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" Feb 13 15:58:05.633559 containerd[1564]: 
time="2025-02-13T15:58:05.633449165Z" level=error msg="Failed to destroy network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.634122 containerd[1564]: time="2025-02-13T15:58:05.634052372Z" level=error msg="encountered an error cleaning up failed sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.634353 containerd[1564]: time="2025-02-13T15:58:05.634091802Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.634653 kubelet[2920]: E0213 15:58:05.634463 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.634653 kubelet[2920]: E0213 15:58:05.634581 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:05.634653 kubelet[2920]: E0213 15:58:05.634596 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:05.634752 kubelet[2920]: E0213 15:58:05.634622 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" Feb 13 15:58:05.643673 containerd[1564]: time="2025-02-13T15:58:05.643583866Z" level=error msg="Failed to destroy network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Feb 13 15:58:05.644276 containerd[1564]: time="2025-02-13T15:58:05.644100205Z" level=error msg="encountered an error cleaning up failed sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.646639 containerd[1564]: time="2025-02-13T15:58:05.646609955Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.646796 kubelet[2920]: E0213 15:58:05.646761 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.646927 kubelet[2920]: E0213 15:58:05.646812 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:05.646927 
kubelet[2920]: E0213 15:58:05.646827 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:05.646927 kubelet[2920]: E0213 15:58:05.646859 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" Feb 13 15:58:05.651460 containerd[1564]: time="2025-02-13T15:58:05.651434492Z" level=error msg="Failed to destroy network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.651821 containerd[1564]: time="2025-02-13T15:58:05.651806871Z" level=error msg="encountered an error cleaning up failed sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.652599 containerd[1564]: time="2025-02-13T15:58:05.652584084Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.652909 kubelet[2920]: E0213 15:58:05.652796 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.652909 kubelet[2920]: E0213 15:58:05.652841 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:05.652909 kubelet[2920]: E0213 15:58:05.652855 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:05.652985 kubelet[2920]: E0213 15:58:05.652888 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" Feb 13 15:58:05.659476 containerd[1564]: time="2025-02-13T15:58:05.659383994Z" level=error msg="Failed to destroy network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.659705 containerd[1564]: time="2025-02-13T15:58:05.659605016Z" level=error msg="encountered an error cleaning up failed sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.659705 containerd[1564]: time="2025-02-13T15:58:05.659641449Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.659878 kubelet[2920]: E0213 15:58:05.659838 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.659920 kubelet[2920]: E0213 15:58:05.659904 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:05.659949 kubelet[2920]: E0213 15:58:05.659918 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:05.660126 kubelet[2920]: E0213 15:58:05.659981 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:58:05.669137 containerd[1564]: time="2025-02-13T15:58:05.669112691Z" level=error msg="Failed to destroy network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.669655 containerd[1564]: time="2025-02-13T15:58:05.669641320Z" level=error msg="encountered an error cleaning up failed sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.669843 containerd[1564]: time="2025-02-13T15:58:05.669831648Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Feb 13 15:58:05.670566 kubelet[2920]: E0213 15:58:05.670112 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:05.670566 kubelet[2920]: E0213 15:58:05.670160 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:05.670566 kubelet[2920]: E0213 15:58:05.670173 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:05.670652 kubelet[2920]: E0213 15:58:05.670207 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podUID="a494df23-70e9-451c-8266-c2382d1a2d64" Feb 13 15:58:05.920827 systemd[1]: run-netns-cni\x2da87df21d\x2d57b0\x2d5187\x2de7ae\x2dd5ac5a46a901.mount: Deactivated successfully. Feb 13 15:58:06.324571 kubelet[2920]: I0213 15:58:06.324481 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9" Feb 13 15:58:06.340920 containerd[1564]: time="2025-02-13T15:58:06.340887220Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:06.341250 containerd[1564]: time="2025-02-13T15:58:06.341079398Z" level=info msg="Ensure that sandbox 0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9 in task-service has been cleanup successfully" Feb 13 15:58:06.342749 containerd[1564]: time="2025-02-13T15:58:06.342668045Z" level=info msg="TearDown network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:06.342749 containerd[1564]: time="2025-02-13T15:58:06.342681993Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:06.343726 systemd[1]: run-netns-cni\x2d9656f5c9\x2d2b8c\x2da7c1\x2d7175\x2d05ecf24da5fa.mount: Deactivated successfully. 
Feb 13 15:58:06.379782 containerd[1564]: time="2025-02-13T15:58:06.379704766Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:06.379782 containerd[1564]: time="2025-02-13T15:58:06.379770534Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully" Feb 13 15:58:06.379782 containerd[1564]: time="2025-02-13T15:58:06.379777512Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:06.382683 containerd[1564]: time="2025-02-13T15:58:06.382540403Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:06.382683 containerd[1564]: time="2025-02-13T15:58:06.382594329Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:06.382683 containerd[1564]: time="2025-02-13T15:58:06.382602299Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:06.382941 containerd[1564]: time="2025-02-13T15:58:06.382924078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:3,}" Feb 13 15:58:06.385030 kubelet[2920]: I0213 15:58:06.384959 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae" Feb 13 15:58:06.392246 kubelet[2920]: I0213 15:58:06.391894 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a" Feb 13 15:58:06.388629 systemd[1]: run-netns-cni\x2dcf03da20\x2da11a\x2dc873\x2dda0d\x2d81b2430fa371.mount: Deactivated 
successfully. Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.385386900Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.385511361Z" level=info msg="Ensure that sandbox 44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae in task-service has been cleanup successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.388967683Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.388981641Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.390331026Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.390391742Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.390398902Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.391160466Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\"" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.391202766Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.391212423Z" level=info msg="StopPodSandbox for 
\"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.392397620Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.393143261Z" level=info msg="Ensure that sandbox 09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a in task-service has been cleanup successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.393661191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:3,}" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.395392219Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.395402588Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:06.400647 containerd[1564]: time="2025-02-13T15:58:06.397456631Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:06.395219 systemd[1]: run-netns-cni\x2d27748014\x2d88ae\x2dd2bb\x2d6e8f\x2d1a20ce3c2db3.mount: Deactivated successfully. 
Feb 13 15:58:06.424573 kubelet[2920]: I0213 15:58:06.415377 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48" Feb 13 15:58:06.424573 kubelet[2920]: I0213 15:58:06.423669 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.404736830Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.404751224Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.414235406Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.414314976Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.414324652Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.414868679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:3,}" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.415762902Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.416463962Z" level=info msg="Ensure that sandbox 
60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48 in task-service has been cleanup successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.418631512Z" level=info msg="TearDown network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.418654621Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" returns successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.419282384Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.419467052Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.419476278Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.421236802Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.421290046Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.421298940Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:06.424632 containerd[1564]: time="2025-02-13T15:58:06.423412138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:3,}" Feb 13 15:58:06.418766 systemd[1]: 
run-netns-cni\x2d1169191d\x2d51c2\x2d396f\x2d3ca8\x2df85073d9d8e7.mount: Deactivated successfully. Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.426518378Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427336574Z" level=info msg="Ensure that sandbox c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db in task-service has been cleanup successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427507861Z" level=info msg="TearDown network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427516608Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" returns successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427643896Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427690606Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.427696735Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.428453097Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.428487966Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.428495442Z" level=info 
msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:06.431957 containerd[1564]: time="2025-02-13T15:58:06.430259101Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:3,}" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.436143639Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.436267361Z" level=info msg="Ensure that sandbox e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc in task-service has been cleanup successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.436805419Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.436815629Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.436958996Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.437032448Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.437043593Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.437176406Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:06.466562 containerd[1564]: 
time="2025-02-13T15:58:06.437231426Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.437239623Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:06.466562 containerd[1564]: time="2025-02-13T15:58:06.437416344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:3,}" Feb 13 15:58:06.466748 kubelet[2920]: I0213 15:58:06.435156 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc" Feb 13 15:58:06.918329 systemd[1]: run-netns-cni\x2d7a0d2b93\x2df85f\x2d7dae\x2df477\x2d1c5ee06fc72e.mount: Deactivated successfully. Feb 13 15:58:06.918653 systemd[1]: run-netns-cni\x2d80f35111\x2db6d2\x2d8bea\x2d35f7\x2deb1d795abc62.mount: Deactivated successfully. 
Feb 13 15:58:07.597980 containerd[1564]: time="2025-02-13T15:58:07.597779110Z" level=error msg="Failed to destroy network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.598940 containerd[1564]: time="2025-02-13T15:58:07.598467157Z" level=error msg="encountered an error cleaning up failed sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.598940 containerd[1564]: time="2025-02-13T15:58:07.598503414Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.599138 kubelet[2920]: E0213 15:58:07.598712 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.599138 kubelet[2920]: E0213 15:58:07.598749 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:07.599138 kubelet[2920]: E0213 15:58:07.598762 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:07.599322 kubelet[2920]: E0213 15:58:07.598789 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" Feb 13 15:58:07.662996 containerd[1564]: time="2025-02-13T15:58:07.662961956Z" level=error msg="Failed to destroy network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.664185 containerd[1564]: time="2025-02-13T15:58:07.664170373Z" level=error msg="encountered an error cleaning up failed sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.664698 containerd[1564]: time="2025-02-13T15:58:07.664307533Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.665249 kubelet[2920]: E0213 15:58:07.664904 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.665249 kubelet[2920]: E0213 15:58:07.664947 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:07.665249 kubelet[2920]: E0213 15:58:07.664961 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:07.665330 kubelet[2920]: E0213 15:58:07.664987 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" Feb 13 15:58:07.687729 containerd[1564]: time="2025-02-13T15:58:07.687704803Z" level=error msg="Failed to destroy network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.690881 containerd[1564]: time="2025-02-13T15:58:07.688005327Z" level=error msg="encountered an error cleaning up failed sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.690881 containerd[1564]: time="2025-02-13T15:58:07.688053228Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.691095 kubelet[2920]: E0213 15:58:07.688196 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.691095 kubelet[2920]: E0213 15:58:07.688245 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:07.691095 kubelet[2920]: E0213 15:58:07.688259 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:07.692467 kubelet[2920]: E0213 15:58:07.688311 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" Feb 13 15:58:07.713435 containerd[1564]: time="2025-02-13T15:58:07.713355976Z" level=error msg="Failed to destroy network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.716823 containerd[1564]: time="2025-02-13T15:58:07.713877763Z" level=error msg="encountered an error cleaning up failed sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.716823 containerd[1564]: time="2025-02-13T15:58:07.713931360Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.716823 containerd[1564]: time="2025-02-13T15:58:07.715993770Z" level=error msg="Failed to destroy network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.716823 containerd[1564]: time="2025-02-13T15:58:07.716277902Z" level=error msg="encountered an error cleaning up failed sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.716823 containerd[1564]: time="2025-02-13T15:58:07.716325442Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.725980 kubelet[2920]: E0213 15:58:07.714057 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.725980 kubelet[2920]: E0213 15:58:07.714090 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:07.725980 kubelet[2920]: E0213 15:58:07.714102 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:07.746775 containerd[1564]: time="2025-02-13T15:58:07.722687069Z" level=error msg="Failed to destroy network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.746775 containerd[1564]: time="2025-02-13T15:58:07.722927800Z" level=error msg="encountered an error cleaning up failed sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.746775 containerd[1564]: time="2025-02-13T15:58:07.723016770Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.752111 kubelet[2920]: E0213 15:58:07.714125 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:58:07.752111 kubelet[2920]: E0213 15:58:07.716429 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.752111 kubelet[2920]: E0213 15:58:07.716454 2920 kuberuntime_sandbox.go:72] "Failed to 
create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:07.788662 kubelet[2920]: E0213 15:58:07.716465 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:07.788662 kubelet[2920]: E0213 15:58:07.716486 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podUID="a494df23-70e9-451c-8266-c2382d1a2d64" Feb 13 15:58:07.788662 kubelet[2920]: E0213 15:58:07.724008 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:07.788775 kubelet[2920]: E0213 15:58:07.724040 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:07.788775 kubelet[2920]: E0213 15:58:07.724058 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:07.788775 kubelet[2920]: E0213 15:58:07.724086 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" 
podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" Feb 13 15:58:07.920364 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b-shm.mount: Deactivated successfully. Feb 13 15:58:07.920428 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614-shm.mount: Deactivated successfully. Feb 13 15:58:08.308004 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3694209773.mount: Deactivated successfully. Feb 13 15:58:08.440002 kubelet[2920]: I0213 15:58:08.439280 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc" Feb 13 15:58:08.444033 containerd[1564]: time="2025-02-13T15:58:08.440249571Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" Feb 13 15:58:08.449712 containerd[1564]: time="2025-02-13T15:58:08.449685895Z" level=info msg="Ensure that sandbox 98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc in task-service has been cleanup successfully" Feb 13 15:58:08.451395 containerd[1564]: time="2025-02-13T15:58:08.449882829Z" level=info msg="TearDown network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" successfully" Feb 13 15:58:08.451395 containerd[1564]: time="2025-02-13T15:58:08.449905638Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" returns successfully" Feb 13 15:58:08.451213 systemd[1]: run-netns-cni\x2df08f96fe\x2d0230\x2d59d4\x2d0bd0\x2df8080d1961c6.mount: Deactivated successfully. 
Feb 13 15:58:08.451782 containerd[1564]: time="2025-02-13T15:58:08.451734350Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:08.451832 containerd[1564]: time="2025-02-13T15:58:08.451807300Z" level=info msg="TearDown network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" successfully" Feb 13 15:58:08.451832 containerd[1564]: time="2025-02-13T15:58:08.451822847Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" returns successfully" Feb 13 15:58:08.452233 containerd[1564]: time="2025-02-13T15:58:08.452173118Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:08.452265 containerd[1564]: time="2025-02-13T15:58:08.452247255Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:08.452265 containerd[1564]: time="2025-02-13T15:58:08.452260438Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:08.452574 containerd[1564]: time="2025-02-13T15:58:08.452541842Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:08.452714 containerd[1564]: time="2025-02-13T15:58:08.452624324Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:08.452714 containerd[1564]: time="2025-02-13T15:58:08.452635225Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:08.453210 containerd[1564]: time="2025-02-13T15:58:08.453013152Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:4,}" Feb 13 15:58:08.453913 kubelet[2920]: I0213 15:58:08.453895 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162" Feb 13 15:58:08.459286 containerd[1564]: time="2025-02-13T15:58:08.459056289Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" Feb 13 15:58:08.459286 containerd[1564]: time="2025-02-13T15:58:08.459190687Z" level=info msg="Ensure that sandbox 415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162 in task-service has been cleanup successfully" Feb 13 15:58:08.460513 systemd[1]: run-netns-cni\x2d04db833f\x2d7144\x2d3f18\x2d7564\x2d6e4b89ed09bd.mount: Deactivated successfully. Feb 13 15:58:08.460618 containerd[1564]: time="2025-02-13T15:58:08.460574095Z" level=info msg="TearDown network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" successfully" Feb 13 15:58:08.460618 containerd[1564]: time="2025-02-13T15:58:08.460586385Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" returns successfully" Feb 13 15:58:08.461019 containerd[1564]: time="2025-02-13T15:58:08.460996521Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:08.461052 containerd[1564]: time="2025-02-13T15:58:08.461038334Z" level=info msg="TearDown network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" successfully" Feb 13 15:58:08.461052 containerd[1564]: time="2025-02-13T15:58:08.461044825Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" returns successfully" Feb 13 15:58:08.461400 containerd[1564]: time="2025-02-13T15:58:08.461331824Z" level=info 
msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:08.461400 containerd[1564]: time="2025-02-13T15:58:08.461389874Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:08.461400 containerd[1564]: time="2025-02-13T15:58:08.461400445Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:08.461888 containerd[1564]: time="2025-02-13T15:58:08.461623084Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:08.461888 containerd[1564]: time="2025-02-13T15:58:08.461666235Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:08.461888 containerd[1564]: time="2025-02-13T15:58:08.461673324Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:08.461957 kubelet[2920]: I0213 15:58:08.461880 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e" Feb 13 15:58:08.462505 containerd[1564]: time="2025-02-13T15:58:08.462412628Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:08.463775 containerd[1564]: time="2025-02-13T15:58:08.462708933Z" level=info msg="Ensure that sandbox 449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e in task-service has been cleanup successfully" Feb 13 15:58:08.463982 containerd[1564]: time="2025-02-13T15:58:08.463964786Z" level=info msg="TearDown network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" successfully" Feb 13 15:58:08.464289 containerd[1564]: 
time="2025-02-13T15:58:08.464029943Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" returns successfully" Feb 13 15:58:08.464289 containerd[1564]: time="2025-02-13T15:58:08.464170273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:4,}" Feb 13 15:58:08.465174 systemd[1]: run-netns-cni\x2dc2cf3cda\x2dc03b\x2d45a7\x2d6412\x2d256f19e8e9ec.mount: Deactivated successfully. Feb 13 15:58:08.465469 containerd[1564]: time="2025-02-13T15:58:08.465455279Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:08.465583 containerd[1564]: time="2025-02-13T15:58:08.465508977Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:08.465583 containerd[1564]: time="2025-02-13T15:58:08.465516811Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:08.465863 containerd[1564]: time="2025-02-13T15:58:08.465727400Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:08.465863 containerd[1564]: time="2025-02-13T15:58:08.465763121Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:08.465863 containerd[1564]: time="2025-02-13T15:58:08.465769458Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:08.466523 containerd[1564]: time="2025-02-13T15:58:08.465936450Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:08.466523 containerd[1564]: time="2025-02-13T15:58:08.465975041Z" 
level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:08.466523 containerd[1564]: time="2025-02-13T15:58:08.465985007Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:08.466523 containerd[1564]: time="2025-02-13T15:58:08.466227798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:4,}" Feb 13 15:58:08.466714 kubelet[2920]: I0213 15:58:08.466508 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614" Feb 13 15:58:08.467284 containerd[1564]: time="2025-02-13T15:58:08.466908209Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:08.467284 containerd[1564]: time="2025-02-13T15:58:08.467032112Z" level=info msg="Ensure that sandbox 2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614 in task-service has been cleanup successfully" Feb 13 15:58:08.467284 containerd[1564]: time="2025-02-13T15:58:08.467149220Z" level=info msg="TearDown network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" successfully" Feb 13 15:58:08.467284 containerd[1564]: time="2025-02-13T15:58:08.467157223Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" returns successfully" Feb 13 15:58:08.467864 containerd[1564]: time="2025-02-13T15:58:08.467534302Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:08.467864 containerd[1564]: time="2025-02-13T15:58:08.467789022Z" level=info msg="TearDown network for sandbox 
\"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:08.467864 containerd[1564]: time="2025-02-13T15:58:08.467797709Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:08.468470 containerd[1564]: time="2025-02-13T15:58:08.468172875Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:08.468761 containerd[1564]: time="2025-02-13T15:58:08.468529989Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully" Feb 13 15:58:08.468761 containerd[1564]: time="2025-02-13T15:58:08.468538090Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:08.469038 containerd[1564]: time="2025-02-13T15:58:08.469009873Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:08.469087 containerd[1564]: time="2025-02-13T15:58:08.469057787Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:08.469087 containerd[1564]: time="2025-02-13T15:58:08.469065605Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:08.469457 containerd[1564]: time="2025-02-13T15:58:08.469431645Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:4,}" Feb 13 15:58:08.470271 containerd[1564]: time="2025-02-13T15:58:08.470249752Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" Feb 13 15:58:08.470308 kubelet[2920]: I0213 15:58:08.469867 2920 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b" Feb 13 15:58:08.470832 containerd[1564]: time="2025-02-13T15:58:08.470661993Z" level=info msg="Ensure that sandbox ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b in task-service has been cleanup successfully" Feb 13 15:58:08.470832 containerd[1564]: time="2025-02-13T15:58:08.470809963Z" level=info msg="TearDown network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" successfully" Feb 13 15:58:08.470832 containerd[1564]: time="2025-02-13T15:58:08.470818250Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" returns successfully" Feb 13 15:58:08.471099 containerd[1564]: time="2025-02-13T15:58:08.470971079Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:08.471099 containerd[1564]: time="2025-02-13T15:58:08.471011927Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully" Feb 13 15:58:08.471099 containerd[1564]: time="2025-02-13T15:58:08.471023518Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully" Feb 13 15:58:08.471304 containerd[1564]: time="2025-02-13T15:58:08.471290263Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:08.471333 containerd[1564]: time="2025-02-13T15:58:08.471328243Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:08.471357 containerd[1564]: time="2025-02-13T15:58:08.471333921Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 
15:58:08.471655 containerd[1564]: time="2025-02-13T15:58:08.471630912Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\"" Feb 13 15:58:08.471714 containerd[1564]: time="2025-02-13T15:58:08.471681414Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully" Feb 13 15:58:08.471714 containerd[1564]: time="2025-02-13T15:58:08.471687911Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:08.472374 containerd[1564]: time="2025-02-13T15:58:08.472358185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:4,}" Feb 13 15:58:08.472778 kubelet[2920]: I0213 15:58:08.472637 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48" Feb 13 15:58:08.473373 containerd[1564]: time="2025-02-13T15:58:08.473153781Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:08.473933 containerd[1564]: time="2025-02-13T15:58:08.473897480Z" level=info msg="Ensure that sandbox 45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48 in task-service has been cleanup successfully" Feb 13 15:58:08.474159 containerd[1564]: time="2025-02-13T15:58:08.474115201Z" level=info msg="TearDown network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" successfully" Feb 13 15:58:08.474159 containerd[1564]: time="2025-02-13T15:58:08.474125744Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" returns successfully" Feb 13 15:58:08.474460 containerd[1564]: time="2025-02-13T15:58:08.474446648Z" level=info msg="StopPodSandbox for 
\"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:08.474497 containerd[1564]: time="2025-02-13T15:58:08.474485053Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:08.474497 containerd[1564]: time="2025-02-13T15:58:08.474491045Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:08.474922 containerd[1564]: time="2025-02-13T15:58:08.474849272Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:08.474922 containerd[1564]: time="2025-02-13T15:58:08.474892089Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:08.474922 containerd[1564]: time="2025-02-13T15:58:08.474898190Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:08.475688 containerd[1564]: time="2025-02-13T15:58:08.475670684Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:08.475725 containerd[1564]: time="2025-02-13T15:58:08.475712498Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:08.475725 containerd[1564]: time="2025-02-13T15:58:08.475719681Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:08.475981 containerd[1564]: time="2025-02-13T15:58:08.475966923Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:4,}" Feb 13 15:58:08.557370 containerd[1564]: 
time="2025-02-13T15:58:08.557226817Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:08.569362 containerd[1564]: time="2025-02-13T15:58:08.567446004Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 15:58:08.621686 containerd[1564]: time="2025-02-13T15:58:08.621661724Z" level=error msg="Failed to destroy network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.623418 containerd[1564]: time="2025-02-13T15:58:08.623403030Z" level=error msg="encountered an error cleaning up failed sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.623526 containerd[1564]: time="2025-02-13T15:58:08.623510473Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.624013 kubelet[2920]: E0213 15:58:08.623798 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.624013 kubelet[2920]: E0213 15:58:08.623836 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:08.624013 kubelet[2920]: E0213 15:58:08.623848 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:08.624234 kubelet[2920]: E0213 15:58:08.623873 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" Feb 13 15:58:08.640206 containerd[1564]: time="2025-02-13T15:58:08.640177081Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:08.662226 containerd[1564]: time="2025-02-13T15:58:08.662199985Z" level=error msg="Failed to destroy network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.662525 containerd[1564]: time="2025-02-13T15:58:08.662510976Z" level=error msg="encountered an error cleaning up failed sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.662614 containerd[1564]: time="2025-02-13T15:58:08.662602261Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.662744 containerd[1564]: time="2025-02-13T15:58:08.662722971Z" level=error msg="Failed to destroy network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.662948 containerd[1564]: time="2025-02-13T15:58:08.662935522Z" level=error msg="encountered an error cleaning up failed sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.663048 containerd[1564]: time="2025-02-13T15:58:08.663035413Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.664463 kubelet[2920]: E0213 15:58:08.664207 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.664463 kubelet[2920]: E0213 15:58:08.664265 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Feb 13 15:58:08.664463 kubelet[2920]: E0213 15:58:08.664279 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:08.664463 kubelet[2920]: E0213 15:58:08.664291 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:08.664602 kubelet[2920]: E0213 15:58:08.664328 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" Feb 13 15:58:08.664659 kubelet[2920]: E0213 15:58:08.664648 2920 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:08.664743 kubelet[2920]: E0213 15:58:08.664733 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-c8kpg" Feb 13 15:58:08.664817 kubelet[2920]: E0213 15:58:08.664797 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-c8kpg_kube-system(a494df23-70e9-451c-8266-c2382d1a2d64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podUID="a494df23-70e9-451c-8266-c2382d1a2d64" Feb 13 15:58:08.664956 containerd[1564]: time="2025-02-13T15:58:08.664928826Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:08.671312 containerd[1564]: 
time="2025-02-13T15:58:08.671287095Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 5.601358348s" Feb 13 15:58:08.671411 containerd[1564]: time="2025-02-13T15:58:08.671401916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 15:58:08.681174 containerd[1564]: time="2025-02-13T15:58:08.681100477Z" level=error msg="Failed to destroy network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.681609 containerd[1564]: time="2025-02-13T15:58:08.681595339Z" level=error msg="encountered an error cleaning up failed sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.681699 containerd[1564]: time="2025-02-13T15:58:08.681686000Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 13 15:58:08.682107 kubelet[2920]: E0213 15:58:08.681917 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.682107 kubelet[2920]: E0213 15:58:08.681951 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:08.682107 kubelet[2920]: E0213 15:58:08.681963 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fz6cz" Feb 13 15:58:08.682210 kubelet[2920]: E0213 15:58:08.681986 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fz6cz_calico-system(b126212a-e016-4060-9fc3-97a9a5142c06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fz6cz" podUID="b126212a-e016-4060-9fc3-97a9a5142c06" Feb 13 15:58:08.708940 containerd[1564]: time="2025-02-13T15:58:08.708907532Z" level=error msg="Failed to destroy network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.709247 containerd[1564]: time="2025-02-13T15:58:08.709232612Z" level=error msg="encountered an error cleaning up failed sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.709320 containerd[1564]: time="2025-02-13T15:58:08.709308005Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.709518 kubelet[2920]: E0213 15:58:08.709492 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.709601 kubelet[2920]: E0213 15:58:08.709590 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:08.709651 kubelet[2920]: E0213 15:58:08.709641 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:08.709709 kubelet[2920]: E0213 15:58:08.709697 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" Feb 13 15:58:08.715457 containerd[1564]: time="2025-02-13T15:58:08.715424145Z" level=error msg="Failed to destroy network 
for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.715918 containerd[1564]: time="2025-02-13T15:58:08.715870407Z" level=error msg="encountered an error cleaning up failed sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.715966 containerd[1564]: time="2025-02-13T15:58:08.715914381Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.716081 kubelet[2920]: E0213 15:58:08.716054 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:08.716116 kubelet[2920]: E0213 15:58:08.716097 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:08.716116 kubelet[2920]: E0213 15:58:08.716110 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:08.716175 kubelet[2920]: E0213 15:58:08.716135 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" Feb 13 15:58:08.867456 containerd[1564]: time="2025-02-13T15:58:08.867280614Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 15:58:08.923001 systemd[1]: run-netns-cni\x2d1940c426\x2d6eed\x2d0968\x2dcbce\x2de7af9d5bce87.mount: Deactivated successfully. 
Feb 13 15:58:08.923224 systemd[1]: run-netns-cni\x2d02e13119\x2d6eeb\x2d873c\x2d3f7d\x2d269799a9534f.mount: Deactivated successfully. Feb 13 15:58:08.923306 systemd[1]: run-netns-cni\x2d9b480e72\x2d8e78\x2da71b\x2dd80d\x2d7ff0f04b9b99.mount: Deactivated successfully. Feb 13 15:58:09.070931 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2537950023.mount: Deactivated successfully. Feb 13 15:58:09.083742 containerd[1564]: time="2025-02-13T15:58:09.083672932Z" level=info msg="CreateContainer within sandbox \"541166131e571d9c35c684a2247afcbc202570fa3def196900f533d8472fdce5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199\"" Feb 13 15:58:09.093794 containerd[1564]: time="2025-02-13T15:58:09.093766443Z" level=info msg="StartContainer for \"f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199\"" Feb 13 15:58:09.174117 systemd[1]: Started cri-containerd-f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199.scope - libcontainer container f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199. Feb 13 15:58:09.197117 containerd[1564]: time="2025-02-13T15:58:09.197034060Z" level=info msg="StartContainer for \"f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199\" returns successfully" Feb 13 15:58:09.458065 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 15:58:09.461005 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Feb 13 15:58:09.475843 kubelet[2920]: I0213 15:58:09.475814 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02" Feb 13 15:58:09.477203 containerd[1564]: time="2025-02-13T15:58:09.476417792Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" Feb 13 15:58:09.477203 containerd[1564]: time="2025-02-13T15:58:09.476534314Z" level=info msg="Ensure that sandbox 1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02 in task-service has been cleanup successfully" Feb 13 15:58:09.477203 containerd[1564]: time="2025-02-13T15:58:09.477191509Z" level=info msg="StopPodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" Feb 13 15:58:09.477312 kubelet[2920]: I0213 15:58:09.476989 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380" Feb 13 15:58:09.477336 containerd[1564]: time="2025-02-13T15:58:09.477281231Z" level=info msg="Ensure that sandbox d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380 in task-service has been cleanup successfully" Feb 13 15:58:09.477462 containerd[1564]: time="2025-02-13T15:58:09.477409925Z" level=info msg="TearDown network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" successfully" Feb 13 15:58:09.477462 containerd[1564]: time="2025-02-13T15:58:09.477420582Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" returns successfully" Feb 13 15:58:09.477462 containerd[1564]: time="2025-02-13T15:58:09.477432888Z" level=info msg="TearDown network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" successfully" Feb 13 15:58:09.477462 containerd[1564]: time="2025-02-13T15:58:09.477440972Z" level=info msg="StopPodSandbox for 
\"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" returns successfully" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477670948Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477712874Z" level=info msg="TearDown network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" successfully" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477719775Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" returns successfully" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477715172Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477779268Z" level=info msg="TearDown network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" successfully" Feb 13 15:58:09.478028 containerd[1564]: time="2025-02-13T15:58:09.477784668Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" returns successfully" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479066274Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479129353Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479192194Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479200072Z" level=info msg="StopPodSandbox for 
\"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479131456Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:09.479333 containerd[1564]: time="2025-02-13T15:58:09.479223950Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479683878Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479725115Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479732864Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479739195Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479762520Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.479771894Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480060646Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\"" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480094506Z" level=info msg="StopPodSandbox for 
\"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480101908Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480111231Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480131207Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480137510Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480393375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:5,}" Feb 13 15:58:09.483625 containerd[1564]: time="2025-02-13T15:58:09.480526244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:5,}" Feb 13 15:58:09.508896 kubelet[2920]: I0213 15:58:09.508575 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a" Feb 13 15:58:09.509176 containerd[1564]: time="2025-02-13T15:58:09.509018571Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" Feb 13 15:58:09.509176 containerd[1564]: time="2025-02-13T15:58:09.509151277Z" level=info msg="Ensure that sandbox 5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a in task-service has been cleanup 
successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.509361434Z" level=info msg="TearDown network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.509383915Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" returns successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.509722946Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.509763044Z" level=info msg="TearDown network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.509786170Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" returns successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.510027104Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.510065734Z" level=info msg="TearDown network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:09.510282 containerd[1564]: time="2025-02-13T15:58:09.510112562Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510318252Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510383564Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" 
successfully" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510391114Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510658481Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510711450Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.510719095Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:09.513773 containerd[1564]: time="2025-02-13T15:58:09.512801297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:5,}" Feb 13 15:58:09.514821 kubelet[2920]: I0213 15:58:09.514685 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2" Feb 13 15:58:09.516890 containerd[1564]: time="2025-02-13T15:58:09.516022235Z" level=info msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" Feb 13 15:58:09.516890 containerd[1564]: time="2025-02-13T15:58:09.516147863Z" level=info msg="Ensure that sandbox 41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2 in task-service has been cleanup successfully" Feb 13 15:58:09.516890 containerd[1564]: time="2025-02-13T15:58:09.516478399Z" level=info msg="TearDown network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" successfully" Feb 13 15:58:09.516890 containerd[1564]: time="2025-02-13T15:58:09.516487740Z" level=info 
msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" returns successfully" Feb 13 15:58:09.523602 containerd[1564]: time="2025-02-13T15:58:09.523583016Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:09.525971 containerd[1564]: time="2025-02-13T15:58:09.525888994Z" level=info msg="TearDown network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" successfully" Feb 13 15:58:09.526114 containerd[1564]: time="2025-02-13T15:58:09.525905355Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" returns successfully" Feb 13 15:58:09.526810 containerd[1564]: time="2025-02-13T15:58:09.526791355Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:09.527633 containerd[1564]: time="2025-02-13T15:58:09.527205701Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:09.527633 containerd[1564]: time="2025-02-13T15:58:09.527218114Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:09.545867 containerd[1564]: time="2025-02-13T15:58:09.545829707Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:09.547328 containerd[1564]: time="2025-02-13T15:58:09.547097894Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:09.547328 containerd[1564]: time="2025-02-13T15:58:09.547108674Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:09.559391 containerd[1564]: time="2025-02-13T15:58:09.558603200Z" level=info 
msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:09.559391 containerd[1564]: time="2025-02-13T15:58:09.558681192Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:09.559391 containerd[1564]: time="2025-02-13T15:58:09.558690321Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:09.562209 containerd[1564]: time="2025-02-13T15:58:09.562180212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:5,}" Feb 13 15:58:09.594009 containerd[1564]: time="2025-02-13T15:58:09.593979704Z" level=error msg="Failed to destroy network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:09.594226 containerd[1564]: time="2025-02-13T15:58:09.594208418Z" level=error msg="encountered an error cleaning up failed sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:09.594273 containerd[1564]: time="2025-02-13T15:58:09.594248342Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:09.606571 kubelet[2920]: E0213 15:58:09.606148 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:09.606571 kubelet[2920]: E0213 15:58:09.606177 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:09.606571 kubelet[2920]: E0213 15:58:09.606192 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-jnfjq" Feb 13 15:58:09.606693 kubelet[2920]: E0213 15:58:09.606216 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-jnfjq_kube-system(440fcad6-af3c-4b75-a27e-d1a967a963e9)\\\": rpc error: code = Unknown desc = 
failed to setup network for sandbox \\\"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podUID="440fcad6-af3c-4b75-a27e-d1a967a963e9" Feb 13 15:58:09.616980 kubelet[2920]: I0213 15:58:09.616508 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a" Feb 13 15:58:09.618276 containerd[1564]: time="2025-02-13T15:58:09.618245704Z" level=info msg="StopPodSandbox for \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\"" Feb 13 15:58:09.618994 containerd[1564]: time="2025-02-13T15:58:09.618589956Z" level=info msg="Ensure that sandbox e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a in task-service has been cleanup successfully" Feb 13 15:58:09.621029 containerd[1564]: time="2025-02-13T15:58:09.619120416Z" level=info msg="TearDown network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" successfully" Feb 13 15:58:09.621029 containerd[1564]: time="2025-02-13T15:58:09.619130729Z" level=info msg="StopPodSandbox for \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" returns successfully" Feb 13 15:58:09.621135 containerd[1564]: time="2025-02-13T15:58:09.621117096Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" Feb 13 15:58:09.621295 containerd[1564]: time="2025-02-13T15:58:09.621177248Z" level=info msg="TearDown network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" successfully" Feb 13 15:58:09.621346 containerd[1564]: time="2025-02-13T15:58:09.621332424Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" 
returns successfully" Feb 13 15:58:09.621757 containerd[1564]: time="2025-02-13T15:58:09.621743774Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:09.622872 containerd[1564]: time="2025-02-13T15:58:09.622789670Z" level=info msg="TearDown network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" successfully" Feb 13 15:58:09.622872 containerd[1564]: time="2025-02-13T15:58:09.622868757Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" returns successfully" Feb 13 15:58:09.623358 containerd[1564]: time="2025-02-13T15:58:09.623317635Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:09.623395 containerd[1564]: time="2025-02-13T15:58:09.623389525Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:09.623413 containerd[1564]: time="2025-02-13T15:58:09.623396489Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:09.624719 containerd[1564]: time="2025-02-13T15:58:09.624693327Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:09.624978 containerd[1564]: time="2025-02-13T15:58:09.624944691Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:09.624978 containerd[1564]: time="2025-02-13T15:58:09.624968213Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:09.628687 containerd[1564]: time="2025-02-13T15:58:09.628132949Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:5,}" Feb 13 15:58:09.630758 kubelet[2920]: I0213 15:58:09.630265 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b" Feb 13 15:58:09.631055 containerd[1564]: time="2025-02-13T15:58:09.630782856Z" level=info msg="StopPodSandbox for \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\"" Feb 13 15:58:09.632323 containerd[1564]: time="2025-02-13T15:58:09.631261384Z" level=info msg="Ensure that sandbox bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b in task-service has been cleanup successfully" Feb 13 15:58:09.636978 containerd[1564]: time="2025-02-13T15:58:09.636930259Z" level=info msg="TearDown network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" successfully" Feb 13 15:58:09.637241 containerd[1564]: time="2025-02-13T15:58:09.637199705Z" level=info msg="StopPodSandbox for \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" returns successfully" Feb 13 15:58:09.643621 containerd[1564]: time="2025-02-13T15:58:09.642942090Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" Feb 13 15:58:09.643621 containerd[1564]: time="2025-02-13T15:58:09.643263475Z" level=info msg="TearDown network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" successfully" Feb 13 15:58:09.643621 containerd[1564]: time="2025-02-13T15:58:09.643272827Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" returns successfully" Feb 13 15:58:09.644808 containerd[1564]: time="2025-02-13T15:58:09.643907569Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:09.644808 containerd[1564]: 
time="2025-02-13T15:58:09.644170163Z" level=info msg="TearDown network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" successfully" Feb 13 15:58:09.644808 containerd[1564]: time="2025-02-13T15:58:09.644178326Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" returns successfully" Feb 13 15:58:09.644808 containerd[1564]: time="2025-02-13T15:58:09.644469116Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:09.646153 containerd[1564]: time="2025-02-13T15:58:09.646141241Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:09.646232 containerd[1564]: time="2025-02-13T15:58:09.646223146Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:09.648649 containerd[1564]: time="2025-02-13T15:58:09.648538598Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:09.649252 containerd[1564]: time="2025-02-13T15:58:09.648736693Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:09.649252 containerd[1564]: time="2025-02-13T15:58:09.648745947Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:09.651380 containerd[1564]: time="2025-02-13T15:58:09.650983000Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:5,}" Feb 13 15:58:09.663960 kubelet[2920]: I0213 15:58:09.660450 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-pwzpk" 
podStartSLOduration=3.831803122 podStartE2EDuration="18.646893554s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:57:53.887291552 +0000 UTC m=+22.971991525" lastFinishedPulling="2025-02-13 15:58:08.702381981 +0000 UTC m=+37.787081957" observedRunningTime="2025-02-13 15:58:09.624292813 +0000 UTC m=+38.708992794" watchObservedRunningTime="2025-02-13 15:58:09.646893554 +0000 UTC m=+38.731593537" Feb 13 15:58:09.924792 systemd[1]: run-netns-cni\x2df8055a7b\x2dd1a4\x2d2dee\x2dde99\x2daf22682c2c14.mount: Deactivated successfully. Feb 13 15:58:09.925124 systemd[1]: run-netns-cni\x2d9cf045b0\x2dd2f9\x2dc3d6\x2ddebb\x2da2d1d80192b7.mount: Deactivated successfully. Feb 13 15:58:09.925240 systemd[1]: run-netns-cni\x2dde7720c4\x2db4da\x2d2b6f\x2dfd13\x2dad807a832c4b.mount: Deactivated successfully. Feb 13 15:58:09.925339 systemd[1]: run-netns-cni\x2db821d5ff\x2dc934\x2dc67f\x2d3d02\x2d9d2bda4a1d2b.mount: Deactivated successfully. Feb 13 15:58:09.925436 systemd[1]: run-netns-cni\x2d6889d9ca\x2d63f2\x2d5f31\x2d22aa\x2d3eedc278e745.mount: Deactivated successfully. Feb 13 15:58:09.925527 systemd[1]: run-netns-cni\x2d7a79a658\x2d981a\x2dd365\x2d9773\x2df8a0999ae1d9.mount: Deactivated successfully. Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.808 [INFO][4813] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.808 [INFO][4813] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" iface="eth0" netns="/var/run/netns/cni-b36c3d92-4d1d-38fd-0b0a-c34355de6741" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.809 [INFO][4813] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" iface="eth0" netns="/var/run/netns/cni-b36c3d92-4d1d-38fd-0b0a-c34355de6741" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.822 [INFO][4813] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" iface="eth0" netns="/var/run/netns/cni-b36c3d92-4d1d-38fd-0b0a-c34355de6741" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.822 [INFO][4813] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:09.822 [INFO][4813] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.333 [INFO][4892] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" HandleID="k8s-pod-network.d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.334 [INFO][4892] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.334 [INFO][4892] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.342 [WARNING][4892] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" HandleID="k8s-pod-network.d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.342 [INFO][4892] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" HandleID="k8s-pod-network.d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.343 [INFO][4892] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:58:10.350192 containerd[1564]: 2025-02-13 15:58:10.348 [INFO][4813] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4" Feb 13 15:58:10.352027 systemd[1]: run-netns-cni\x2db36c3d92\x2d4d1d\x2d38fd\x2d0b0a\x2dc34355de6741.mount: Deactivated successfully. Feb 13 15:58:10.354189 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4-shm.mount: Deactivated successfully. Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.832 [INFO][4829] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.832 [INFO][4829] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" iface="eth0" netns="/var/run/netns/cni-8d1dbffe-38b5-2a87-f634-5575716d3f09" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.832 [INFO][4829] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" iface="eth0" netns="/var/run/netns/cni-8d1dbffe-38b5-2a87-f634-5575716d3f09" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.833 [INFO][4829] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" iface="eth0" netns="/var/run/netns/cni-8d1dbffe-38b5-2a87-f634-5575716d3f09" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.833 [INFO][4829] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:09.833 [INFO][4829] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.333 [INFO][4894] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" HandleID="k8s-pod-network.5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.334 [INFO][4894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.343 [INFO][4894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.349 [WARNING][4894] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" HandleID="k8s-pod-network.5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.349 [INFO][4894] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" HandleID="k8s-pod-network.5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.352 [INFO][4894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:58:10.363208 containerd[1564]: 2025-02-13 15:58:10.359 [INFO][4829] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde" Feb 13 15:58:10.365064 systemd[1]: run-netns-cni\x2d8d1dbffe\x2d38b5\x2d2a87\x2df634\x2d5575716d3f09.mount: Deactivated successfully. Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.827 [INFO][4801] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.827 [INFO][4801] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" iface="eth0" netns="/var/run/netns/cni-f4034ba6-c7c8-17e7-47ab-c145aab746eb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.828 [INFO][4801] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. 
ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" iface="eth0" netns="/var/run/netns/cni-f4034ba6-c7c8-17e7-47ab-c145aab746eb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.828 [INFO][4801] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" iface="eth0" netns="/var/run/netns/cni-f4034ba6-c7c8-17e7-47ab-c145aab746eb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.828 [INFO][4801] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:09.828 [INFO][4801] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.334 [INFO][4893] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" HandleID="k8s-pod-network.8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.334 [INFO][4893] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.353 [INFO][4893] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.359 [WARNING][4893] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" HandleID="k8s-pod-network.8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.359 [INFO][4893] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" HandleID="k8s-pod-network.8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.361 [INFO][4893] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:58:10.367907 containerd[1564]: 2025-02-13 15:58:10.363 [INFO][4801] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb" Feb 13 15:58:10.367915 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde-shm.mount: Deactivated successfully. 
Feb 13 15:58:10.369468 containerd[1564]: time="2025-02-13T15:58:10.369342216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:10.369716 kubelet[2920]: E0213 15:58:10.369684 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:10.369876 kubelet[2920]: E0213 15:58:10.369810 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:10.369876 kubelet[2920]: E0213 15:58:10.369834 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" Feb 13 15:58:10.370121 kubelet[2920]: E0213 15:58:10.369967 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ddbbb799c-mbdnx_calico-system(3fabfa82-f1ab-4e56-bc9e-31febf370fec)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5628fe08bbec2a9618208f8850af880170a8528a62a941377ee966e00e9fddde\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podUID="3fabfa82-f1ab-4e56-bc9e-31febf370fec" Feb 13 15:58:10.370583 containerd[1564]: time="2025-02-13T15:58:10.370178057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:10.370855 containerd[1564]: time="2025-02-13T15:58:10.370786795Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Feb 13 15:58:10.371067 kubelet[2920]: E0213 15:58:10.370970 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:10.371067 kubelet[2920]: E0213 15:58:10.370970 2920 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 15:58:10.371067 kubelet[2920]: E0213 15:58:10.370993 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:10.371067 kubelet[2920]: E0213 15:58:10.371007 2920 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:10.371191 kubelet[2920]: E0213 15:58:10.371011 
2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" Feb 13 15:58:10.371191 kubelet[2920]: E0213 15:58:10.371019 2920 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" Feb 13 15:58:10.371191 kubelet[2920]: E0213 15:58:10.371030 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-nz7vq_calico-apiserver(d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d84e9f8daa30abad1b7ac22a0dba4b8a9244313f115fea3f7a06ab1ea0a016e4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podUID="d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4" Feb 13 15:58:10.371583 kubelet[2920]: E0213 15:58:10.371042 2920 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-8dd87f54d-92htb_calico-apiserver(01cdc21b-7e9c-4a56-99bc-b4069f009602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podUID="01cdc21b-7e9c-4a56-99bc-b4069f009602" Feb 13 15:58:10.406314 systemd-networkd[1473]: cali71ef68f27cc: Link UP Feb 13 15:58:10.406425 systemd-networkd[1473]: cali71ef68f27cc: Gained carrier Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:09.713 [INFO][4842] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:09.796 [INFO][4842] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--fz6cz-eth0 csi-node-driver- calico-system b126212a-e016-4060-9fc3-97a9a5142c06 592 0 2025-02-13 15:57:51 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-fz6cz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali71ef68f27cc [] []}} ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:09.796 
[INFO][4842] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.333 [INFO][4897] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" HandleID="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Workload="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.349 [INFO][4897] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" HandleID="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Workload="localhost-k8s-csi--node--driver--fz6cz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000104c50), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-fz6cz", "timestamp":"2025-02-13 15:58:10.333245348 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.349 [INFO][4897] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.364 [INFO][4897] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.365 [INFO][4897] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.371 [INFO][4897] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.381 [INFO][4897] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.384 [INFO][4897] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.384 [INFO][4897] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.385 [INFO][4897] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.386 [INFO][4897] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.386 [INFO][4897] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.388 [INFO][4897] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.391 [INFO][4897] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.391 [INFO][4897] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" host="localhost" Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.391 [INFO][4897] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:58:10.426584 containerd[1564]: 2025-02-13 15:58:10.391 [INFO][4897] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" HandleID="k8s-pod-network.dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Workload="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.394 [INFO][4842] cni-plugin/k8s.go 386: Populated endpoint ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fz6cz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b126212a-e016-4060-9fc3-97a9a5142c06", ResourceVersion:"592", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-fz6cz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali71ef68f27cc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.394 [INFO][4842] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.394 [INFO][4842] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71ef68f27cc ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.407 [INFO][4842] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.408 [INFO][4842] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" 
Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fz6cz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b126212a-e016-4060-9fc3-97a9a5142c06", ResourceVersion:"592", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c", Pod:"csi-node-driver-fz6cz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali71ef68f27cc", MAC:"32:ae:f3:69:ca:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.428206 containerd[1564]: 2025-02-13 15:58:10.421 [INFO][4842] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c" Namespace="calico-system" Pod="csi-node-driver-fz6cz" WorkloadEndpoint="localhost-k8s-csi--node--driver--fz6cz-eth0" Feb 13 15:58:10.431755 
systemd-networkd[1473]: cali266e4161af9: Link UP Feb 13 15:58:10.432203 systemd-networkd[1473]: cali266e4161af9: Gained carrier Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:09.722 [INFO][4855] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:09.794 [INFO][4855] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0 coredns-7db6d8ff4d- kube-system a494df23-70e9-451c-8266-c2382d1a2d64 681 0 2025-02-13 15:57:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-c8kpg eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali266e4161af9 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:09.794 [INFO][4855] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.333 [INFO][4896] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" HandleID="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Workload="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.349 [INFO][4896] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" HandleID="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Workload="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000103e90), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-c8kpg", "timestamp":"2025-02-13 15:58:10.333327707 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.349 [INFO][4896] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.392 [INFO][4896] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.392 [INFO][4896] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.393 [INFO][4896] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.396 [INFO][4896] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.399 [INFO][4896] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.401 [INFO][4896] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.409 [INFO][4896] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.410 [INFO][4896] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.411 [INFO][4896] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796 Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.414 [INFO][4896] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.426 [INFO][4896] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.426 [INFO][4896] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" host="localhost" Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.426 [INFO][4896] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 15:58:10.444382 containerd[1564]: 2025-02-13 15:58:10.426 [INFO][4896] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" HandleID="k8s-pod-network.7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Workload="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.430 [INFO][4855] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"a494df23-70e9-451c-8266-c2382d1a2d64", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-c8kpg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali266e4161af9", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.430 [INFO][4855] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.430 [INFO][4855] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali266e4161af9 ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.432 [INFO][4855] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.432 [INFO][4855] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"a494df23-70e9-451c-8266-c2382d1a2d64", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796", Pod:"coredns-7db6d8ff4d-c8kpg", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali266e4161af9", MAC:"5a:4e:db:4a:1a:93", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:10.448523 containerd[1564]: 2025-02-13 15:58:10.442 [INFO][4855] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-c8kpg" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--c8kpg-eth0" Feb 13 15:58:10.456744 containerd[1564]: time="2025-02-13T15:58:10.455093376Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:58:10.456744 containerd[1564]: time="2025-02-13T15:58:10.455320601Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:58:10.456744 containerd[1564]: time="2025-02-13T15:58:10.455332426Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.456744 containerd[1564]: time="2025-02-13T15:58:10.455379336Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.472655 systemd[1]: Started cri-containerd-dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c.scope - libcontainer container dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c. Feb 13 15:58:10.477061 containerd[1564]: time="2025-02-13T15:58:10.476922908Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:58:10.477061 containerd[1564]: time="2025-02-13T15:58:10.477028613Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:58:10.477061 containerd[1564]: time="2025-02-13T15:58:10.477041057Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.477654 containerd[1564]: time="2025-02-13T15:58:10.477583554Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:10.489642 systemd[1]: Started cri-containerd-7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796.scope - libcontainer container 7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796. Feb 13 15:58:10.525632 containerd[1564]: time="2025-02-13T15:58:10.505264664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fz6cz,Uid:b126212a-e016-4060-9fc3-97a9a5142c06,Namespace:calico-system,Attempt:5,} returns sandbox id \"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c\"" Feb 13 15:58:10.525632 containerd[1564]: time="2025-02-13T15:58:10.507228238Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 15:58:10.525632 containerd[1564]: time="2025-02-13T15:58:10.523216376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-c8kpg,Uid:a494df23-70e9-451c-8266-c2382d1a2d64,Namespace:kube-system,Attempt:5,} returns sandbox id \"7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796\"" Feb 13 15:58:10.494930 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:10.499013 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:10.553184 containerd[1564]: time="2025-02-13T15:58:10.553158015Z" level=info msg="CreateContainer within sandbox \"7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 15:58:10.574151 containerd[1564]: time="2025-02-13T15:58:10.574091366Z" level=info msg="CreateContainer within sandbox \"7d69252adb0423c285beab96ea8df4b714c5a42efc10d80388494fb610177796\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"a40ad01e10087a08c42c2213222e55d4ea51a32e2eec008ea0e6061eaa7cc45c\"" Feb 13 15:58:10.575088 containerd[1564]: 
time="2025-02-13T15:58:10.574482335Z" level=info msg="StartContainer for \"a40ad01e10087a08c42c2213222e55d4ea51a32e2eec008ea0e6061eaa7cc45c\"" Feb 13 15:58:10.595337 systemd[1]: Started cri-containerd-a40ad01e10087a08c42c2213222e55d4ea51a32e2eec008ea0e6061eaa7cc45c.scope - libcontainer container a40ad01e10087a08c42c2213222e55d4ea51a32e2eec008ea0e6061eaa7cc45c. Feb 13 15:58:10.630738 kubelet[2920]: I0213 15:58:10.630666 2920 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 15:58:10.637671 containerd[1564]: time="2025-02-13T15:58:10.637629969Z" level=info msg="StartContainer for \"a40ad01e10087a08c42c2213222e55d4ea51a32e2eec008ea0e6061eaa7cc45c\" returns successfully" Feb 13 15:58:10.649201 kubelet[2920]: I0213 15:58:10.649179 2920 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e" Feb 13 15:58:10.649546 containerd[1564]: time="2025-02-13T15:58:10.649522118Z" level=info msg="StopPodSandbox for \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\"" Feb 13 15:58:10.649685 containerd[1564]: time="2025-02-13T15:58:10.649669026Z" level=info msg="Ensure that sandbox d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e in task-service has been cleanup successfully" Feb 13 15:58:10.649974 containerd[1564]: time="2025-02-13T15:58:10.649954418Z" level=info msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" Feb 13 15:58:10.650044 containerd[1564]: time="2025-02-13T15:58:10.650022575Z" level=info msg="TearDown network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" successfully" Feb 13 15:58:10.650044 containerd[1564]: time="2025-02-13T15:58:10.650032528Z" level=info msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" returns successfully" Feb 13 15:58:10.650127 containerd[1564]: 
time="2025-02-13T15:58:10.650114331Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" Feb 13 15:58:10.650161 containerd[1564]: time="2025-02-13T15:58:10.650148731Z" level=info msg="TearDown network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" successfully" Feb 13 15:58:10.650161 containerd[1564]: time="2025-02-13T15:58:10.650158247Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" returns successfully" Feb 13 15:58:10.650285 containerd[1564]: time="2025-02-13T15:58:10.650187582Z" level=info msg="StopPodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" Feb 13 15:58:10.650285 containerd[1564]: time="2025-02-13T15:58:10.650281358Z" level=info msg="TearDown network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" successfully" Feb 13 15:58:10.650322 containerd[1564]: time="2025-02-13T15:58:10.650287894Z" level=info msg="StopPodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" returns successfully" Feb 13 15:58:10.650506 containerd[1564]: time="2025-02-13T15:58:10.650492807Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:10.650567 containerd[1564]: time="2025-02-13T15:58:10.650543284Z" level=info msg="TearDown network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" successfully" Feb 13 15:58:10.650567 containerd[1564]: time="2025-02-13T15:58:10.650561492Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" returns successfully" Feb 13 15:58:10.650610 containerd[1564]: time="2025-02-13T15:58:10.650583235Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:10.650626 containerd[1564]: 
time="2025-02-13T15:58:10.650618119Z" level=info msg="TearDown network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" successfully" Feb 13 15:58:10.650626 containerd[1564]: time="2025-02-13T15:58:10.650623428Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" returns successfully" Feb 13 15:58:10.650723 containerd[1564]: time="2025-02-13T15:58:10.650700545Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:10.650759 containerd[1564]: time="2025-02-13T15:58:10.650747761Z" level=info msg="TearDown network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" successfully" Feb 13 15:58:10.650759 containerd[1564]: time="2025-02-13T15:58:10.650756912Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" returns successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.650793023Z" level=info msg="TearDown network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.650799015Z" level=info msg="StopPodSandbox for \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" returns successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.650947988Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.650992613Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.650999509Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:10.655648 
containerd[1564]: time="2025-02-13T15:58:10.651095137Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651151885Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651159381Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651183523Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651212899Z" level=info msg="TearDown network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651228960Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" returns successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651250526Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651280429Z" level=info msg="TearDown network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651285672Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651339054Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:10.655648 containerd[1564]: 
time="2025-02-13T15:58:10.651380521Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651386680Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651464140Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651496999Z" level=info msg="TearDown network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651502294Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651534618Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651586008Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651592507Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651612218Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651651043Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651658441Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651701978Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651744138Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651750905Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651814545Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651845356Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651850436Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651883629Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651912868Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651917919Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.651936185Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\""
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.653288980Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully"
Feb 13 15:58:10.655648 containerd[1564]: time="2025-02-13T15:58:10.653296773Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653383031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:5,}"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653762238Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:5,}"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653869823Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\""
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653908606Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653915254Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.653949153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:5,}"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.654127038Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\""
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.654178636Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.654184935Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully"
Feb 13 15:58:10.656265 containerd[1564]: time="2025-02-13T15:58:10.654390716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:6,}"
Feb 13 15:58:10.665484 kubelet[2920]: I0213 15:58:10.665447 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-c8kpg" podStartSLOduration=24.665436662 podStartE2EDuration="24.665436662s" podCreationTimestamp="2025-02-13 15:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:58:10.66495846 +0000 UTC m=+39.749658436" watchObservedRunningTime="2025-02-13 15:58:10.665436662 +0000 UTC m=+39.750136639"
Feb 13 15:58:10.919890 systemd[1]: run-netns-cni\x2df4034ba6\x2dc7c8\x2d17e7\x2d47ab\x2dc145aab746eb.mount: Deactivated successfully.
Feb 13 15:58:10.919946 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8027051d1213f08d5c360693485ac818b3e29c7e34f75d963ecc147c7288bfeb-shm.mount: Deactivated successfully.
Feb 13 15:58:10.919986 systemd[1]: run-netns-cni\x2dcc351f53\x2d85f4\x2df144\x2d5d4e\x2df45f47ca6b82.mount: Deactivated successfully.
Feb 13 15:58:11.496879 systemd-networkd[1473]: caliac34cc288c1: Link UP
Feb 13 15:58:11.497002 systemd-networkd[1473]: caliac34cc288c1: Gained carrier
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.381 [INFO][5190] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.399 [INFO][5190] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0 calico-apiserver-8dd87f54d- calico-apiserver d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4 766 0 2025-02-13 15:57:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8dd87f54d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8dd87f54d-nz7vq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliac34cc288c1 [] []}} ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.399 [INFO][5190] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.453 [INFO][5244] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" HandleID="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.463 [INFO][5244] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" HandleID="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003197a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8dd87f54d-nz7vq", "timestamp":"2025-02-13 15:58:11.45361978 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.463 [INFO][5244] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.463 [INFO][5244] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.463 [INFO][5244] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.466 [INFO][5244] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.471 [INFO][5244] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.475 [INFO][5244] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.477 [INFO][5244] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.480 [INFO][5244] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.480 [INFO][5244] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.481 [INFO][5244] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.486 [INFO][5244] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5244] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5244] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" host="localhost"
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5244] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:58:11.520878 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5244] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" HandleID="k8s-pod-network.3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Workload="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.494 [INFO][5190] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0", GenerateName:"calico-apiserver-8dd87f54d-", Namespace:"calico-apiserver", SelfLink:"", UID:"d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4", ResourceVersion:"766", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dd87f54d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8dd87f54d-nz7vq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliac34cc288c1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.494 [INFO][5190] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.494 [INFO][5190] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac34cc288c1 ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.496 [INFO][5190] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.496 [INFO][5190] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0", GenerateName:"calico-apiserver-8dd87f54d-", Namespace:"calico-apiserver", SelfLink:"", UID:"d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4", ResourceVersion:"766", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dd87f54d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d", Pod:"calico-apiserver-8dd87f54d-nz7vq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliac34cc288c1", MAC:"de:3e:ee:51:b0:e2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:11.522990 containerd[1564]: 2025-02-13 15:58:11.516 [INFO][5190] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-nz7vq" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--nz7vq-eth0"
Feb 13 15:58:11.546994 containerd[1564]: time="2025-02-13T15:58:11.545819180Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:58:11.546994 containerd[1564]: time="2025-02-13T15:58:11.546601993Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:58:11.546994 containerd[1564]: time="2025-02-13T15:58:11.546611250Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:11.552367 containerd[1564]: time="2025-02-13T15:58:11.547156753Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:11.555803 systemd-networkd[1473]: cali602f69aac35: Link UP
Feb 13 15:58:11.556659 systemd-networkd[1473]: cali602f69aac35: Gained carrier
Feb 13 15:58:11.580048 systemd[1]: Started cri-containerd-3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d.scope - libcontainer container 3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d.
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.319 [INFO][5193] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.341 [INFO][5193] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0 calico-apiserver-8dd87f54d- calico-apiserver 01cdc21b-7e9c-4a56-99bc-b4069f009602 767 0 2025-02-13 15:57:51 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:8dd87f54d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-8dd87f54d-92htb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali602f69aac35 [] []}} ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.342 [INFO][5193] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.459 [INFO][5226] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" HandleID="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.470 [INFO][5226] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" HandleID="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001c32b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-8dd87f54d-92htb", "timestamp":"2025-02-13 15:58:11.45999246 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.470 [INFO][5226] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5226] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.492 [INFO][5226] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.498 [INFO][5226] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.520 [INFO][5226] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.527 [INFO][5226] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.529 [INFO][5226] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.531 [INFO][5226] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.531 [INFO][5226] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.532 [INFO][5226] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.536 [INFO][5226] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.547 [INFO][5226] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.547 [INFO][5226] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" host="localhost"
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.547 [INFO][5226] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:58:11.582979 containerd[1564]: 2025-02-13 15:58:11.547 [INFO][5226] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" HandleID="k8s-pod-network.cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Workload="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.551 [INFO][5193] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0", GenerateName:"calico-apiserver-8dd87f54d-", Namespace:"calico-apiserver", SelfLink:"", UID:"01cdc21b-7e9c-4a56-99bc-b4069f009602", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dd87f54d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-8dd87f54d-92htb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali602f69aac35", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.552 [INFO][5193] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.552 [INFO][5193] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali602f69aac35 ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.556 [INFO][5193] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.558 [INFO][5193] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0", GenerateName:"calico-apiserver-8dd87f54d-", Namespace:"calico-apiserver", SelfLink:"", UID:"01cdc21b-7e9c-4a56-99bc-b4069f009602", ResourceVersion:"767", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"8dd87f54d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7", Pod:"calico-apiserver-8dd87f54d-92htb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali602f69aac35", MAC:"96:97:bd:86:d0:a3", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:11.584829 containerd[1564]: 2025-02-13 15:58:11.579 [INFO][5193] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7" Namespace="calico-apiserver" Pod="calico-apiserver-8dd87f54d-92htb" WorkloadEndpoint="localhost-k8s-calico--apiserver--8dd87f54d--92htb-eth0"
Feb 13 15:58:11.611461 systemd-networkd[1473]: cali126ee26ea74: Link UP
Feb 13 15:58:11.612419 systemd-networkd[1473]: cali126ee26ea74: Gained carrier
Feb 13 15:58:11.618764 containerd[1564]: time="2025-02-13T15:58:11.618299699Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 15:58:11.618764 containerd[1564]: time="2025-02-13T15:58:11.618419614Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 15:58:11.619194 containerd[1564]: time="2025-02-13T15:58:11.618727984Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:11.621655 containerd[1564]: time="2025-02-13T15:58:11.619840091Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.420 [INFO][5181] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.435 [INFO][5181] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0 calico-kube-controllers-5ddbbb799c- calico-system 3fabfa82-f1ab-4e56-bc9e-31febf370fec 768 0 2025-02-13 15:57:51 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5ddbbb799c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-5ddbbb799c-mbdnx eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali126ee26ea74 [] []}} ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.435 [INFO][5181] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.490 [INFO][5257] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" HandleID="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.518 [INFO][5257] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" HandleID="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001c2640), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-5ddbbb799c-mbdnx", "timestamp":"2025-02-13 15:58:11.490783012 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.518 [INFO][5257] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.549 [INFO][5257] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.549 [INFO][5257] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.550 [INFO][5257] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.557 [INFO][5257] ipam/ipam.go 372: Looking up existing affinities for host host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.570 [INFO][5257] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.580 [INFO][5257] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.584 [INFO][5257] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.585 [INFO][5257] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.586 [INFO][5257] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.592 [INFO][5257] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.597 [INFO][5257] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.598 [INFO][5257] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" host="localhost"
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.598 [INFO][5257] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Feb 13 15:58:11.635970 containerd[1564]: 2025-02-13 15:58:11.598 [INFO][5257] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" HandleID="k8s-pod-network.974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Workload="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.605 [INFO][5181] cni-plugin/k8s.go 386: Populated endpoint ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0", GenerateName:"calico-kube-controllers-5ddbbb799c-", Namespace:"calico-system", SelfLink:"", UID:"3fabfa82-f1ab-4e56-bc9e-31febf370fec", ResourceVersion:"768", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5ddbbb799c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-5ddbbb799c-mbdnx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali126ee26ea74", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}}
Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.607 [INFO][5181] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.607 [INFO][5181] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali126ee26ea74 ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.614 [INFO][5181] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0"
Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.617 [INFO][5181]
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0", GenerateName:"calico-kube-controllers-5ddbbb799c-", Namespace:"calico-system", SelfLink:"", UID:"3fabfa82-f1ab-4e56-bc9e-31febf370fec", ResourceVersion:"768", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5ddbbb799c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74", Pod:"calico-kube-controllers-5ddbbb799c-mbdnx", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali126ee26ea74", MAC:"b2:83:49:61:44:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:11.647511 containerd[1564]: 2025-02-13 15:58:11.633 [INFO][5181] cni-plugin/k8s.go 500: Wrote 
updated endpoint to datastore ContainerID="974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74" Namespace="calico-system" Pod="calico-kube-controllers-5ddbbb799c-mbdnx" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--5ddbbb799c--mbdnx-eth0" Feb 13 15:58:11.641709 systemd[1]: Started cri-containerd-cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7.scope - libcontainer container cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7. Feb 13 15:58:11.665991 systemd-networkd[1473]: cali14e4ffc7966: Link UP Feb 13 15:58:11.669514 systemd-networkd[1473]: cali14e4ffc7966: Gained carrier Feb 13 15:58:11.673310 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:11.675950 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.400 [INFO][5213] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.417 [INFO][5213] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0 coredns-7db6d8ff4d- kube-system 440fcad6-af3c-4b75-a27e-d1a967a963e9 683 0 2025-02-13 15:57:46 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-jnfjq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali14e4ffc7966 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-" Feb 13 15:58:11.689588 
containerd[1564]: 2025-02-13 15:58:11.417 [INFO][5213] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.495 [INFO][5252] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" HandleID="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Workload="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.524 [INFO][5252] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" HandleID="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Workload="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0000fd350), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-jnfjq", "timestamp":"2025-02-13 15:58:11.495829614 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.524 [INFO][5252] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.598 [INFO][5252] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.599 [INFO][5252] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.602 [INFO][5252] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.610 [INFO][5252] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.617 [INFO][5252] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.618 [INFO][5252] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.622 [INFO][5252] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.622 [INFO][5252] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.624 [INFO][5252] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000 Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.639 [INFO][5252] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.645 [INFO][5252] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.645 [INFO][5252] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" host="localhost" Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.645 [INFO][5252] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 15:58:11.689588 containerd[1564]: 2025-02-13 15:58:11.646 [INFO][5252] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" HandleID="k8s-pod-network.7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Workload="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.654 [INFO][5213] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"440fcad6-af3c-4b75-a27e-d1a967a963e9", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-jnfjq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali14e4ffc7966", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.654 [INFO][5213] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.654 [INFO][5213] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali14e4ffc7966 ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.674 [INFO][5213] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 
15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.675 [INFO][5213] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"440fcad6-af3c-4b75-a27e-d1a967a963e9", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 15, 57, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000", Pod:"coredns-7db6d8ff4d-jnfjq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali14e4ffc7966", MAC:"6a:ad:32:2c:6e:00", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 15:58:11.694376 containerd[1564]: 2025-02-13 15:58:11.685 [INFO][5213] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000" Namespace="kube-system" Pod="coredns-7db6d8ff4d-jnfjq" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--jnfjq-eth0" Feb 13 15:58:11.710937 containerd[1564]: time="2025-02-13T15:58:11.710066936Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:58:11.710937 containerd[1564]: time="2025-02-13T15:58:11.710097581Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:58:11.710937 containerd[1564]: time="2025-02-13T15:58:11.710104364Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:11.710937 containerd[1564]: time="2025-02-13T15:58:11.710147626Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:11.718139 kernel: bpftool[5417]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 15:58:11.740702 systemd[1]: Started cri-containerd-974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74.scope - libcontainer container 974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74. 
Feb 13 15:58:11.759481 containerd[1564]: time="2025-02-13T15:58:11.758362909Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-92htb,Uid:01cdc21b-7e9c-4a56-99bc-b4069f009602,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7\"" Feb 13 15:58:11.767803 containerd[1564]: time="2025-02-13T15:58:11.767761168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-8dd87f54d-nz7vq,Uid:d8a4eee4-dc8a-4fd7-97cb-aaa8332159e4,Namespace:calico-apiserver,Attempt:5,} returns sandbox id \"3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d\"" Feb 13 15:58:11.771365 containerd[1564]: time="2025-02-13T15:58:11.770835964Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 15:58:11.771603 containerd[1564]: time="2025-02-13T15:58:11.771271849Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 15:58:11.771603 containerd[1564]: time="2025-02-13T15:58:11.771479898Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:11.771759 containerd[1564]: time="2025-02-13T15:58:11.771609375Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 15:58:11.783978 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:11.790654 systemd[1]: Started cri-containerd-7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000.scope - libcontainer container 7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000. 
Feb 13 15:58:11.806378 systemd-resolved[1476]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 15:58:11.828745 containerd[1564]: time="2025-02-13T15:58:11.828715371Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ddbbb799c-mbdnx,Uid:3fabfa82-f1ab-4e56-bc9e-31febf370fec,Namespace:calico-system,Attempt:5,} returns sandbox id \"974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74\"" Feb 13 15:58:11.838696 containerd[1564]: time="2025-02-13T15:58:11.838662543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-jnfjq,Uid:440fcad6-af3c-4b75-a27e-d1a967a963e9,Namespace:kube-system,Attempt:6,} returns sandbox id \"7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000\"" Feb 13 15:58:11.842182 systemd-networkd[1473]: cali266e4161af9: Gained IPv6LL Feb 13 15:58:11.874041 containerd[1564]: time="2025-02-13T15:58:11.874008409Z" level=info msg="CreateContainer within sandbox \"7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 15:58:11.882680 containerd[1564]: time="2025-02-13T15:58:11.882646959Z" level=info msg="CreateContainer within sandbox \"7e77ee81270baf7a019d077bf99462fa394fd7a974cbb0a1647af3b39e0e7000\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d874df8ea3bc36829e74969c545136081e7585c6a2e9d071916c906d77281e10\"" Feb 13 15:58:11.883399 containerd[1564]: time="2025-02-13T15:58:11.883144743Z" level=info msg="StartContainer for \"d874df8ea3bc36829e74969c545136081e7585c6a2e9d071916c906d77281e10\"" Feb 13 15:58:11.906812 systemd[1]: Started cri-containerd-d874df8ea3bc36829e74969c545136081e7585c6a2e9d071916c906d77281e10.scope - libcontainer container d874df8ea3bc36829e74969c545136081e7585c6a2e9d071916c906d77281e10. 
Feb 13 15:58:11.933773 containerd[1564]: time="2025-02-13T15:58:11.933678128Z" level=info msg="StartContainer for \"d874df8ea3bc36829e74969c545136081e7585c6a2e9d071916c906d77281e10\" returns successfully" Feb 13 15:58:12.019644 systemd-networkd[1473]: cali71ef68f27cc: Gained IPv6LL Feb 13 15:58:12.028401 systemd-networkd[1473]: vxlan.calico: Link UP Feb 13 15:58:12.028407 systemd-networkd[1473]: vxlan.calico: Gained carrier Feb 13 15:58:12.383359 containerd[1564]: time="2025-02-13T15:58:12.382575049Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:12.383359 containerd[1564]: time="2025-02-13T15:58:12.383051651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 15:58:12.383359 containerd[1564]: time="2025-02-13T15:58:12.383138659Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:12.384591 containerd[1564]: time="2025-02-13T15:58:12.384573217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:12.385034 containerd[1564]: time="2025-02-13T15:58:12.385018236Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.877775619s" Feb 13 15:58:12.385067 containerd[1564]: time="2025-02-13T15:58:12.385035240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference 
\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 15:58:12.386263 containerd[1564]: time="2025-02-13T15:58:12.386250485Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 15:58:12.421636 containerd[1564]: time="2025-02-13T15:58:12.421591098Z" level=info msg="CreateContainer within sandbox \"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 15:58:12.435680 containerd[1564]: time="2025-02-13T15:58:12.435656575Z" level=info msg="CreateContainer within sandbox \"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"a71550a52aae30efe79ff07619c27e2b0c4c7151e4fe1bb297d8dbeae14850b4\"" Feb 13 15:58:12.436297 containerd[1564]: time="2025-02-13T15:58:12.436263217Z" level=info msg="StartContainer for \"a71550a52aae30efe79ff07619c27e2b0c4c7151e4fe1bb297d8dbeae14850b4\"" Feb 13 15:58:12.464640 systemd[1]: Started cri-containerd-a71550a52aae30efe79ff07619c27e2b0c4c7151e4fe1bb297d8dbeae14850b4.scope - libcontainer container a71550a52aae30efe79ff07619c27e2b0c4c7151e4fe1bb297d8dbeae14850b4. 
Feb 13 15:58:12.491207 containerd[1564]: time="2025-02-13T15:58:12.491182327Z" level=info msg="StartContainer for \"a71550a52aae30efe79ff07619c27e2b0c4c7151e4fe1bb297d8dbeae14850b4\" returns successfully" Feb 13 15:58:12.703677 kubelet[2920]: I0213 15:58:12.703340 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-jnfjq" podStartSLOduration=26.703329433 podStartE2EDuration="26.703329433s" podCreationTimestamp="2025-02-13 15:57:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 15:58:12.703216456 +0000 UTC m=+41.787916438" watchObservedRunningTime="2025-02-13 15:58:12.703329433 +0000 UTC m=+41.788029410" Feb 13 15:58:13.043638 systemd-networkd[1473]: cali14e4ffc7966: Gained IPv6LL Feb 13 15:58:13.108695 systemd-networkd[1473]: caliac34cc288c1: Gained IPv6LL Feb 13 15:58:13.299707 systemd-networkd[1473]: vxlan.calico: Gained IPv6LL Feb 13 15:58:13.491704 systemd-networkd[1473]: cali602f69aac35: Gained IPv6LL Feb 13 15:58:13.555660 systemd-networkd[1473]: cali126ee26ea74: Gained IPv6LL Feb 13 15:58:15.156910 containerd[1564]: time="2025-02-13T15:58:15.155940737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:15.156910 containerd[1564]: time="2025-02-13T15:58:15.156422984Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Feb 13 15:58:15.156910 containerd[1564]: time="2025-02-13T15:58:15.156839128Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:15.158579 containerd[1564]: time="2025-02-13T15:58:15.158545654Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:15.159233 containerd[1564]: time="2025-02-13T15:58:15.159211766Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.772895654s" Feb 13 15:58:15.159290 containerd[1564]: time="2025-02-13T15:58:15.159233137Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 15:58:15.161155 containerd[1564]: time="2025-02-13T15:58:15.160910461Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 15:58:15.177146 containerd[1564]: time="2025-02-13T15:58:15.177112890Z" level=info msg="CreateContainer within sandbox \"cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 15:58:15.187462 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1646869401.mount: Deactivated successfully. 
Feb 13 15:58:15.198776 containerd[1564]: time="2025-02-13T15:58:15.198665722Z" level=info msg="CreateContainer within sandbox \"cbb1b4c02cd4b715e311b39fd1fdb567f83bdfaba2477265261c8e72bdb249a7\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d\"" Feb 13 15:58:15.199154 containerd[1564]: time="2025-02-13T15:58:15.199138826Z" level=info msg="StartContainer for \"de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d\"" Feb 13 15:58:15.232033 systemd[1]: run-containerd-runc-k8s.io-de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d-runc.TlMNhq.mount: Deactivated successfully. Feb 13 15:58:15.238650 systemd[1]: Started cri-containerd-de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d.scope - libcontainer container de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d. Feb 13 15:58:15.286872 containerd[1564]: time="2025-02-13T15:58:15.286759667Z" level=info msg="StartContainer for \"de09b086c50efa9a90691e5b938dccef4ec4c60ccedc987dd9c3ac7407229f4d\" returns successfully" Feb 13 15:58:15.772939 containerd[1564]: time="2025-02-13T15:58:15.772908568Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 15:58:15.783566 containerd[1564]: time="2025-02-13T15:58:15.783495945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Feb 13 15:58:15.791675 containerd[1564]: time="2025-02-13T15:58:15.784780938Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 623.846213ms" Feb 13 
15:58:15.791675 containerd[1564]: time="2025-02-13T15:58:15.784800861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 15:58:15.800452 containerd[1564]: time="2025-02-13T15:58:15.800434528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Feb 13 15:58:15.836574 containerd[1564]: time="2025-02-13T15:58:15.836521741Z" level=info msg="CreateContainer within sandbox \"3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 15:58:15.919329 containerd[1564]: time="2025-02-13T15:58:15.919276877Z" level=info msg="CreateContainer within sandbox \"3030ed4d2b5acb31cd17b84f9ef3082719fd02ba9cb3a1a2975f19f385681b2d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2ad9e9448b8d51c88b01b8df0de53f9db2a4ac625780b72f1aee8d210f3fb5ff\"" Feb 13 15:58:15.920514 containerd[1564]: time="2025-02-13T15:58:15.920313344Z" level=info msg="StartContainer for \"2ad9e9448b8d51c88b01b8df0de53f9db2a4ac625780b72f1aee8d210f3fb5ff\"" Feb 13 15:58:15.951159 systemd[1]: Started cri-containerd-2ad9e9448b8d51c88b01b8df0de53f9db2a4ac625780b72f1aee8d210f3fb5ff.scope - libcontainer container 2ad9e9448b8d51c88b01b8df0de53f9db2a4ac625780b72f1aee8d210f3fb5ff. 
Feb 13 15:58:16.014217 containerd[1564]: time="2025-02-13T15:58:16.014183760Z" level=info msg="StartContainer for \"2ad9e9448b8d51c88b01b8df0de53f9db2a4ac625780b72f1aee8d210f3fb5ff\" returns successfully" Feb 13 15:58:16.439097 kubelet[2920]: I0213 15:58:16.438833 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8dd87f54d-92htb" podStartSLOduration=22.038917927 podStartE2EDuration="25.438819377s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:58:11.760193598 +0000 UTC m=+40.844893571" lastFinishedPulling="2025-02-13 15:58:15.160095042 +0000 UTC m=+44.244795021" observedRunningTime="2025-02-13 15:58:15.676588735 +0000 UTC m=+44.761288719" watchObservedRunningTime="2025-02-13 15:58:16.438819377 +0000 UTC m=+45.523519354" Feb 13 15:58:16.676335 kubelet[2920]: I0213 15:58:16.676191 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-8dd87f54d-nz7vq" podStartSLOduration=21.659868338 podStartE2EDuration="25.676178954s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:58:11.768906923 +0000 UTC m=+40.853606897" lastFinishedPulling="2025-02-13 15:58:15.78521754 +0000 UTC m=+44.869917513" observedRunningTime="2025-02-13 15:58:16.675680355 +0000 UTC m=+45.760380333" watchObservedRunningTime="2025-02-13 15:58:16.676178954 +0000 UTC m=+45.760878931" Feb 13 15:58:17.397987 systemd[1]: Started sshd@15-139.178.70.106:22-187.62.205.20:51242.service - OpenSSH per-connection server daemon (187.62.205.20:51242). 
Feb 13 15:58:17.707747 kubelet[2920]: I0213 15:58:17.707477 2920 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 13 15:58:18.266010 containerd[1564]: time="2025-02-13T15:58:18.265967507Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:18.266985 containerd[1564]: time="2025-02-13T15:58:18.266814351Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192"
Feb 13 15:58:18.267450 containerd[1564]: time="2025-02-13T15:58:18.267236992Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:18.268836 containerd[1564]: time="2025-02-13T15:58:18.268804182Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:18.269463 containerd[1564]: time="2025-02-13T15:58:18.269436885Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.468799148s"
Feb 13 15:58:18.269507 containerd[1564]: time="2025-02-13T15:58:18.269462682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\""
Feb 13 15:58:18.270429 containerd[1564]: time="2025-02-13T15:58:18.270264965Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\""
Feb 13 15:58:18.287076 containerd[1564]: time="2025-02-13T15:58:18.287008937Z" level=info msg="CreateContainer within sandbox \"974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Feb 13 15:58:18.298244 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2082544099.mount: Deactivated successfully.
Feb 13 15:58:18.298594 containerd[1564]: time="2025-02-13T15:58:18.298543994Z" level=info msg="CreateContainer within sandbox \"974d309c78d3a94ea67d9371169184926e8af69c52b30b631bebf2690ada3d74\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958\""
Feb 13 15:58:18.314584 containerd[1564]: time="2025-02-13T15:58:18.314546468Z" level=info msg="StartContainer for \"3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958\""
Feb 13 15:58:18.349675 systemd[1]: Started cri-containerd-3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958.scope - libcontainer container 3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958.
Feb 13 15:58:18.394168 containerd[1564]: time="2025-02-13T15:58:18.394140390Z" level=info msg="StartContainer for \"3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958\" returns successfully" Feb 13 15:58:18.721663 kubelet[2920]: I0213 15:58:18.721362 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5ddbbb799c-mbdnx" podStartSLOduration=21.282735458 podStartE2EDuration="27.721344429s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:58:11.831542812 +0000 UTC m=+40.916242785" lastFinishedPulling="2025-02-13 15:58:18.270151778 +0000 UTC m=+47.354851756" observedRunningTime="2025-02-13 15:58:18.721113367 +0000 UTC m=+47.805813358" watchObservedRunningTime="2025-02-13 15:58:18.721344429 +0000 UTC m=+47.806044415" Feb 13 15:58:18.901463 sshd[5783]: Invalid user memuser from 187.62.205.20 port 51242 Feb 13 15:58:19.118254 sshd[5783]: Received disconnect from 187.62.205.20 port 51242:11: Bye Bye [preauth] Feb 13 15:58:19.118254 sshd[5783]: Disconnected from invalid user memuser 187.62.205.20 port 51242 [preauth] Feb 13 15:58:19.120307 systemd[1]: sshd@15-139.178.70.106:22-187.62.205.20:51242.service: Deactivated successfully. 
Feb 13 15:58:21.626741 containerd[1564]: time="2025-02-13T15:58:21.626701452Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:21.699365 containerd[1564]: time="2025-02-13T15:58:21.699300510Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081"
Feb 13 15:58:21.750600 containerd[1564]: time="2025-02-13T15:58:21.750366631Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:21.826994 containerd[1564]: time="2025-02-13T15:58:21.826929324Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Feb 13 15:58:21.827863 containerd[1564]: time="2025-02-13T15:58:21.827505268Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 3.557217681s"
Feb 13 15:58:21.827863 containerd[1564]: time="2025-02-13T15:58:21.827532145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\""
Feb 13 15:58:21.844644 containerd[1564]: time="2025-02-13T15:58:21.844587519Z" level=info msg="CreateContainer within sandbox \"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Feb 13 15:58:21.997699 containerd[1564]: time="2025-02-13T15:58:21.997644257Z" level=info msg="CreateContainer within sandbox \"dded73e6652b275e7c5bd33f5ebccc89d61bae1faa04f90b1ad3a4ce5a56344c\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"39386ea528c24408cb284db8d403f5efd28f5760d40528c037ea08746c19f652\""
Feb 13 15:58:22.000092 containerd[1564]: time="2025-02-13T15:58:21.998143569Z" level=info msg="StartContainer for \"39386ea528c24408cb284db8d403f5efd28f5760d40528c037ea08746c19f652\""
Feb 13 15:58:22.040715 systemd[1]: Started cri-containerd-39386ea528c24408cb284db8d403f5efd28f5760d40528c037ea08746c19f652.scope - libcontainer container 39386ea528c24408cb284db8d403f5efd28f5760d40528c037ea08746c19f652.
Feb 13 15:58:22.063180 containerd[1564]: time="2025-02-13T15:58:22.063152236Z" level=info msg="StartContainer for \"39386ea528c24408cb284db8d403f5efd28f5760d40528c037ea08746c19f652\" returns successfully"
Feb 13 15:58:22.729837 kubelet[2920]: I0213 15:58:22.729376 2920 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fz6cz" podStartSLOduration=20.407849911 podStartE2EDuration="31.729364626s" podCreationTimestamp="2025-02-13 15:57:51 +0000 UTC" firstStartedPulling="2025-02-13 15:58:10.50698314 +0000 UTC m=+39.591683114" lastFinishedPulling="2025-02-13 15:58:21.828497849 +0000 UTC m=+50.913197829" observedRunningTime="2025-02-13 15:58:22.728509378 +0000 UTC m=+51.813209360" watchObservedRunningTime="2025-02-13 15:58:22.729364626 +0000 UTC m=+51.814064602"
Feb 13 15:58:23.266467 kubelet[2920]: I0213 15:58:23.266429 2920 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Feb 13 15:58:23.267414 kubelet[2920]: I0213 15:58:23.267405 2920 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Feb 13 15:58:30.715772 systemd[1]: Started sshd@16-139.178.70.106:22-150.138.115.76:38924.service - OpenSSH per-connection server daemon (150.138.115.76:38924).
Feb 13 15:58:31.092302 containerd[1564]: time="2025-02-13T15:58:31.092270798Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\""
Feb 13 15:58:31.118115 containerd[1564]: time="2025-02-13T15:58:31.092579790Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully"
Feb 13 15:58:31.118115 containerd[1564]: time="2025-02-13T15:58:31.118068733Z" level=info msg="StopPodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully"
Feb 13 15:58:31.143945 containerd[1564]: time="2025-02-13T15:58:31.143809798Z" level=info msg="RemovePodSandbox for \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\""
Feb 13 15:58:31.155914 containerd[1564]: time="2025-02-13T15:58:31.155880593Z" level=info msg="Forcibly stopping sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\""
Feb 13 15:58:31.161741 containerd[1564]: time="2025-02-13T15:58:31.155979879Z" level=info msg="TearDown network for sandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" successfully"
Feb 13 15:58:31.166376 containerd[1564]: time="2025-02-13T15:58:31.166350074Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Feb 13 15:58:31.170246 containerd[1564]: time="2025-02-13T15:58:31.170230748Z" level=info msg="RemovePodSandbox \"7a68585f258ea01e0ae766ce3f73b1dc5b6efe6f1232ed991d56357d5d5341e9\" returns successfully" Feb 13 15:58:31.170774 containerd[1564]: time="2025-02-13T15:58:31.170626417Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:31.170774 containerd[1564]: time="2025-02-13T15:58:31.170683365Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:31.170774 containerd[1564]: time="2025-02-13T15:58:31.170690281Z" level=info msg="StopPodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 15:58:31.170986 containerd[1564]: time="2025-02-13T15:58:31.170976189Z" level=info msg="RemovePodSandbox for \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:31.171593 containerd[1564]: time="2025-02-13T15:58:31.171055745Z" level=info msg="Forcibly stopping sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\"" Feb 13 15:58:31.171593 containerd[1564]: time="2025-02-13T15:58:31.171104137Z" level=info msg="TearDown network for sandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" successfully" Feb 13 15:58:31.174619 containerd[1564]: time="2025-02-13T15:58:31.174600184Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.174658 containerd[1564]: time="2025-02-13T15:58:31.174630044Z" level=info msg="RemovePodSandbox \"9a4fdcecf919b9868a17d60f7edb6c432e0cf7237b705ef8c4842037fd5cf856\" returns successfully" Feb 13 15:58:31.174838 containerd[1564]: time="2025-02-13T15:58:31.174824066Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:31.174889 containerd[1564]: time="2025-02-13T15:58:31.174876477Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully" Feb 13 15:58:31.174889 containerd[1564]: time="2025-02-13T15:58:31.174885705Z" level=info msg="StopPodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully" Feb 13 15:58:31.175040 containerd[1564]: time="2025-02-13T15:58:31.175024631Z" level=info msg="RemovePodSandbox for \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:31.175067 containerd[1564]: time="2025-02-13T15:58:31.175057357Z" level=info msg="Forcibly stopping sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\"" Feb 13 15:58:31.175111 containerd[1564]: time="2025-02-13T15:58:31.175089485Z" level=info msg="TearDown network for sandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" successfully" Feb 13 15:58:31.179843 containerd[1564]: time="2025-02-13T15:58:31.179826869Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.179877 containerd[1564]: time="2025-02-13T15:58:31.179851575Z" level=info msg="RemovePodSandbox \"44af470e47d81dd7c3ec896d159a1bf96aa5667fb9d479599d012fd1d1df7cae\" returns successfully" Feb 13 15:58:31.180026 containerd[1564]: time="2025-02-13T15:58:31.180012699Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" Feb 13 15:58:31.180071 containerd[1564]: time="2025-02-13T15:58:31.180057755Z" level=info msg="TearDown network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" successfully" Feb 13 15:58:31.180071 containerd[1564]: time="2025-02-13T15:58:31.180069081Z" level=info msg="StopPodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" returns successfully" Feb 13 15:58:31.180222 containerd[1564]: time="2025-02-13T15:58:31.180208672Z" level=info msg="RemovePodSandbox for \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" Feb 13 15:58:31.180249 containerd[1564]: time="2025-02-13T15:58:31.180223309Z" level=info msg="Forcibly stopping sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\"" Feb 13 15:58:31.180299 containerd[1564]: time="2025-02-13T15:58:31.180255131Z" level=info msg="TearDown network for sandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" successfully" Feb 13 15:58:31.181843 containerd[1564]: time="2025-02-13T15:58:31.181825992Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.185188 containerd[1564]: time="2025-02-13T15:58:31.185170616Z" level=info msg="RemovePodSandbox \"ed9fd68e874e6ca2798ce2b524a071a870e7b4c42f018b65757e8b937b067a8b\" returns successfully" Feb 13 15:58:31.185336 containerd[1564]: time="2025-02-13T15:58:31.185323688Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" Feb 13 15:58:31.185408 containerd[1564]: time="2025-02-13T15:58:31.185383439Z" level=info msg="TearDown network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" successfully" Feb 13 15:58:31.185408 containerd[1564]: time="2025-02-13T15:58:31.185393056Z" level=info msg="StopPodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" returns successfully" Feb 13 15:58:31.186416 containerd[1564]: time="2025-02-13T15:58:31.185611156Z" level=info msg="RemovePodSandbox for \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" Feb 13 15:58:31.186416 containerd[1564]: time="2025-02-13T15:58:31.185624978Z" level=info msg="Forcibly stopping sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\"" Feb 13 15:58:31.186416 containerd[1564]: time="2025-02-13T15:58:31.185660322Z" level=info msg="TearDown network for sandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" successfully" Feb 13 15:58:31.186807 containerd[1564]: time="2025-02-13T15:58:31.186794554Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.186864 containerd[1564]: time="2025-02-13T15:58:31.186854900Z" level=info msg="RemovePodSandbox \"1e4f1ec70e5bf7968e89175f6384b8edbfce030cc9fdb709042a796222078a02\" returns successfully" Feb 13 15:58:31.187075 containerd[1564]: time="2025-02-13T15:58:31.187060740Z" level=info msg="StopPodSandbox for \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\"" Feb 13 15:58:31.187114 containerd[1564]: time="2025-02-13T15:58:31.187102978Z" level=info msg="TearDown network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" successfully" Feb 13 15:58:31.187114 containerd[1564]: time="2025-02-13T15:58:31.187112025Z" level=info msg="StopPodSandbox for \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" returns successfully" Feb 13 15:58:31.187272 containerd[1564]: time="2025-02-13T15:58:31.187260275Z" level=info msg="RemovePodSandbox for \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\"" Feb 13 15:58:31.187298 containerd[1564]: time="2025-02-13T15:58:31.187272355Z" level=info msg="Forcibly stopping sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\"" Feb 13 15:58:31.187349 containerd[1564]: time="2025-02-13T15:58:31.187301350Z" level=info msg="TearDown network for sandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" successfully" Feb 13 15:58:31.188579 containerd[1564]: time="2025-02-13T15:58:31.188563203Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.188584695Z" level=info msg="RemovePodSandbox \"d880819eed2765f13499bddfc04cd1f2082d3edc014984de631089507289b29e\" returns successfully" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.188821643Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.188866434Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.188872926Z" level=info msg="StopPodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.189098934Z" level=info msg="RemovePodSandbox for \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.189111461Z" level=info msg="Forcibly stopping sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\"" Feb 13 15:58:31.189283 containerd[1564]: time="2025-02-13T15:58:31.189214212Z" level=info msg="TearDown network for sandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" successfully" Feb 13 15:58:31.191106 containerd[1564]: time="2025-02-13T15:58:31.191094211Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.191259 containerd[1564]: time="2025-02-13T15:58:31.191160599Z" level=info msg="RemovePodSandbox \"84a2389c72b73ee96843c64c0b7706535f2e409df62e20f3918fa8544fc0528a\" returns successfully" Feb 13 15:58:31.191305 containerd[1564]: time="2025-02-13T15:58:31.191291001Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:31.191356 containerd[1564]: time="2025-02-13T15:58:31.191341998Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:31.191387 containerd[1564]: time="2025-02-13T15:58:31.191354174Z" level=info msg="StopPodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:31.191561 containerd[1564]: time="2025-02-13T15:58:31.191539778Z" level=info msg="RemovePodSandbox for \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:31.191592 containerd[1564]: time="2025-02-13T15:58:31.191569093Z" level=info msg="Forcibly stopping sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\"" Feb 13 15:58:31.191625 containerd[1564]: time="2025-02-13T15:58:31.191601767Z" level=info msg="TearDown network for sandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" successfully" Feb 13 15:58:31.193113 containerd[1564]: time="2025-02-13T15:58:31.193096950Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.193147 containerd[1564]: time="2025-02-13T15:58:31.193118943Z" level=info msg="RemovePodSandbox \"744623025ca0bc5ed45d562f16c3bf6b46db744d7b4e32a10580fb4667bc70b6\" returns successfully" Feb 13 15:58:31.193524 containerd[1564]: time="2025-02-13T15:58:31.193361488Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:31.193524 containerd[1564]: time="2025-02-13T15:58:31.193408804Z" level=info msg="TearDown network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" successfully" Feb 13 15:58:31.193524 containerd[1564]: time="2025-02-13T15:58:31.193415276Z" level=info msg="StopPodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" returns successfully" Feb 13 15:58:31.193617 containerd[1564]: time="2025-02-13T15:58:31.193576591Z" level=info msg="RemovePodSandbox for \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:31.193617 containerd[1564]: time="2025-02-13T15:58:31.193587580Z" level=info msg="Forcibly stopping sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\"" Feb 13 15:58:31.193650 containerd[1564]: time="2025-02-13T15:58:31.193619475Z" level=info msg="TearDown network for sandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" successfully" Feb 13 15:58:31.195233 containerd[1564]: time="2025-02-13T15:58:31.195215971Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.195266 containerd[1564]: time="2025-02-13T15:58:31.195238565Z" level=info msg="RemovePodSandbox \"c8d9136008f396605deb5b5f3954272dd428f051e684126a8c6999dfe53762db\" returns successfully" Feb 13 15:58:31.195486 containerd[1564]: time="2025-02-13T15:58:31.195409666Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" Feb 13 15:58:31.195486 containerd[1564]: time="2025-02-13T15:58:31.195451515Z" level=info msg="TearDown network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" successfully" Feb 13 15:58:31.195486 containerd[1564]: time="2025-02-13T15:58:31.195457875Z" level=info msg="StopPodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" returns successfully" Feb 13 15:58:31.195608 containerd[1564]: time="2025-02-13T15:58:31.195592216Z" level=info msg="RemovePodSandbox for \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" Feb 13 15:58:31.195663 containerd[1564]: time="2025-02-13T15:58:31.195649202Z" level=info msg="Forcibly stopping sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\"" Feb 13 15:58:31.195733 containerd[1564]: time="2025-02-13T15:58:31.195709256Z" level=info msg="TearDown network for sandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" successfully" Feb 13 15:58:31.197367 containerd[1564]: time="2025-02-13T15:58:31.197353180Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.197404 containerd[1564]: time="2025-02-13T15:58:31.197382103Z" level=info msg="RemovePodSandbox \"415741410bd8ea062905ca5c48e3942d9336fbd95481c2fa214d8d5fb1ef5162\" returns successfully" Feb 13 15:58:31.197529 containerd[1564]: time="2025-02-13T15:58:31.197514800Z" level=info msg="StopPodSandbox for \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\"" Feb 13 15:58:31.197626 containerd[1564]: time="2025-02-13T15:58:31.197612237Z" level=info msg="TearDown network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" successfully" Feb 13 15:58:31.197626 containerd[1564]: time="2025-02-13T15:58:31.197623014Z" level=info msg="StopPodSandbox for \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" returns successfully" Feb 13 15:58:31.197784 containerd[1564]: time="2025-02-13T15:58:31.197748276Z" level=info msg="RemovePodSandbox for \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\"" Feb 13 15:58:31.197784 containerd[1564]: time="2025-02-13T15:58:31.197761663Z" level=info msg="Forcibly stopping sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\"" Feb 13 15:58:31.197948 containerd[1564]: time="2025-02-13T15:58:31.197904111Z" level=info msg="TearDown network for sandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" successfully" Feb 13 15:58:31.199262 containerd[1564]: time="2025-02-13T15:58:31.199201656Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.199262 containerd[1564]: time="2025-02-13T15:58:31.199223127Z" level=info msg="RemovePodSandbox \"bacb477aa1280696f6ace6c0a2b7435dd5e47b45049997b4a8f78907ac6d831b\" returns successfully" Feb 13 15:58:31.199373 containerd[1564]: time="2025-02-13T15:58:31.199359141Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:31.199426 containerd[1564]: time="2025-02-13T15:58:31.199407209Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:31.199426 containerd[1564]: time="2025-02-13T15:58:31.199417139Z" level=info msg="StopPodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:31.199808 containerd[1564]: time="2025-02-13T15:58:31.199508247Z" level=info msg="RemovePodSandbox for \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:31.199808 containerd[1564]: time="2025-02-13T15:58:31.199518114Z" level=info msg="Forcibly stopping sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\"" Feb 13 15:58:31.199808 containerd[1564]: time="2025-02-13T15:58:31.199578039Z" level=info msg="TearDown network for sandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" successfully" Feb 13 15:58:31.201103 containerd[1564]: time="2025-02-13T15:58:31.201087884Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.201206 containerd[1564]: time="2025-02-13T15:58:31.201110823Z" level=info msg="RemovePodSandbox \"e5acb3aee965fc1273da65242accb0e8fae6297036703918b89e9ddf93063da6\" returns successfully" Feb 13 15:58:31.201278 containerd[1564]: time="2025-02-13T15:58:31.201252623Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:31.201327 containerd[1564]: time="2025-02-13T15:58:31.201315171Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully" Feb 13 15:58:31.201327 containerd[1564]: time="2025-02-13T15:58:31.201323701Z" level=info msg="StopPodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:31.201458 containerd[1564]: time="2025-02-13T15:58:31.201445062Z" level=info msg="RemovePodSandbox for \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:31.201484 containerd[1564]: time="2025-02-13T15:58:31.201458832Z" level=info msg="Forcibly stopping sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\"" Feb 13 15:58:31.201520 containerd[1564]: time="2025-02-13T15:58:31.201508339Z" level=info msg="TearDown network for sandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" successfully" Feb 13 15:58:31.203183 containerd[1564]: time="2025-02-13T15:58:31.203167727Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.203628 containerd[1564]: time="2025-02-13T15:58:31.203191452Z" level=info msg="RemovePodSandbox \"7a5f0e30f6dde0c88933a96efae6d27ae062875de6cdc59c8526ea8ddb1ff9cb\" returns successfully" Feb 13 15:58:31.203628 containerd[1564]: time="2025-02-13T15:58:31.203446098Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:31.203628 containerd[1564]: time="2025-02-13T15:58:31.203491306Z" level=info msg="TearDown network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:31.203628 containerd[1564]: time="2025-02-13T15:58:31.203498043Z" level=info msg="StopPodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:31.203736 containerd[1564]: time="2025-02-13T15:58:31.203658418Z" level=info msg="RemovePodSandbox for \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:31.203736 containerd[1564]: time="2025-02-13T15:58:31.203676380Z" level=info msg="Forcibly stopping sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\"" Feb 13 15:58:31.203736 containerd[1564]: time="2025-02-13T15:58:31.203716343Z" level=info msg="TearDown network for sandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" successfully" Feb 13 15:58:31.205271 containerd[1564]: time="2025-02-13T15:58:31.205257331Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.205367 containerd[1564]: time="2025-02-13T15:58:31.205277163Z" level=info msg="RemovePodSandbox \"0ec5f3173767680a895071a9ef5d190e5eef9f93ab72426d82945ef99a5e8db9\" returns successfully" Feb 13 15:58:31.205439 containerd[1564]: time="2025-02-13T15:58:31.205423960Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:31.205512 containerd[1564]: time="2025-02-13T15:58:31.205466321Z" level=info msg="TearDown network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" successfully" Feb 13 15:58:31.205512 containerd[1564]: time="2025-02-13T15:58:31.205472326Z" level=info msg="StopPodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" returns successfully" Feb 13 15:58:31.205694 containerd[1564]: time="2025-02-13T15:58:31.205680069Z" level=info msg="RemovePodSandbox for \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:31.205694 containerd[1564]: time="2025-02-13T15:58:31.205692770Z" level=info msg="Forcibly stopping sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\"" Feb 13 15:58:31.205754 containerd[1564]: time="2025-02-13T15:58:31.205721901Z" level=info msg="TearDown network for sandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" successfully" Feb 13 15:58:31.207462 containerd[1564]: time="2025-02-13T15:58:31.207443477Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.207497 containerd[1564]: time="2025-02-13T15:58:31.207464682Z" level=info msg="RemovePodSandbox \"2a6ab8d445067146921a0bb879f1daa077c5496501b33a7ff998ac6c45cc2614\" returns successfully" Feb 13 15:58:31.207812 containerd[1564]: time="2025-02-13T15:58:31.207700298Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" Feb 13 15:58:31.207880 containerd[1564]: time="2025-02-13T15:58:31.207857921Z" level=info msg="TearDown network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" successfully" Feb 13 15:58:31.207999 containerd[1564]: time="2025-02-13T15:58:31.207912211Z" level=info msg="StopPodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" returns successfully" Feb 13 15:58:31.208052 containerd[1564]: time="2025-02-13T15:58:31.208030849Z" level=info msg="RemovePodSandbox for \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" Feb 13 15:58:31.208052 containerd[1564]: time="2025-02-13T15:58:31.208042524Z" level=info msg="Forcibly stopping sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\"" Feb 13 15:58:31.208176 containerd[1564]: time="2025-02-13T15:58:31.208150698Z" level=info msg="TearDown network for sandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" successfully" Feb 13 15:58:31.209518 containerd[1564]: time="2025-02-13T15:58:31.209502750Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.209588 containerd[1564]: time="2025-02-13T15:58:31.209526935Z" level=info msg="RemovePodSandbox \"5e739ad0736adfa01ac634316ee706cdf412ae2c332512eeaf950df74f0a8e4a\" returns successfully" Feb 13 15:58:31.209847 containerd[1564]: time="2025-02-13T15:58:31.209717160Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:31.209847 containerd[1564]: time="2025-02-13T15:58:31.209755300Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:31.209847 containerd[1564]: time="2025-02-13T15:58:31.209761583Z" level=info msg="StopPodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:31.209922 containerd[1564]: time="2025-02-13T15:58:31.209882350Z" level=info msg="RemovePodSandbox for \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:31.209922 containerd[1564]: time="2025-02-13T15:58:31.209892115Z" level=info msg="Forcibly stopping sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\"" Feb 13 15:58:31.209960 containerd[1564]: time="2025-02-13T15:58:31.209923566Z" level=info msg="TearDown network for sandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" successfully" Feb 13 15:58:31.211174 containerd[1564]: time="2025-02-13T15:58:31.211160581Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.211206 containerd[1564]: time="2025-02-13T15:58:31.211183420Z" level=info msg="RemovePodSandbox \"e54df6dcbe960aa7766d79f7c4566faed7f0f88bdc50dd8206899f46f952a38a\" returns successfully" Feb 13 15:58:31.211371 containerd[1564]: time="2025-02-13T15:58:31.211340156Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:31.211398 containerd[1564]: time="2025-02-13T15:58:31.211389031Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:31.211436 containerd[1564]: time="2025-02-13T15:58:31.211398094Z" level=info msg="StopPodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:31.211664 containerd[1564]: time="2025-02-13T15:58:31.211562692Z" level=info msg="RemovePodSandbox for \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:31.211664 containerd[1564]: time="2025-02-13T15:58:31.211622583Z" level=info msg="Forcibly stopping sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\"" Feb 13 15:58:31.212597 containerd[1564]: time="2025-02-13T15:58:31.211751489Z" level=info msg="TearDown network for sandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" successfully" Feb 13 15:58:31.213944 containerd[1564]: time="2025-02-13T15:58:31.213814807Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.213944 containerd[1564]: time="2025-02-13T15:58:31.213850576Z" level=info msg="RemovePodSandbox \"cb088ff1a1c52e42848e4257d99ab311dff05d6b811cf76f01628a22c540d1c8\" returns successfully" Feb 13 15:58:31.215676 containerd[1564]: time="2025-02-13T15:58:31.215661004Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:31.215848 containerd[1564]: time="2025-02-13T15:58:31.215715365Z" level=info msg="TearDown network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" successfully" Feb 13 15:58:31.215848 containerd[1564]: time="2025-02-13T15:58:31.215847388Z" level=info msg="StopPodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" returns successfully" Feb 13 15:58:31.224860 containerd[1564]: time="2025-02-13T15:58:31.224844026Z" level=info msg="RemovePodSandbox for \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:31.224902 containerd[1564]: time="2025-02-13T15:58:31.224860038Z" level=info msg="Forcibly stopping sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\"" Feb 13 15:58:31.224924 containerd[1564]: time="2025-02-13T15:58:31.224893790Z" level=info msg="TearDown network for sandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" successfully" Feb 13 15:58:31.226294 containerd[1564]: time="2025-02-13T15:58:31.226278553Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.226703 containerd[1564]: time="2025-02-13T15:58:31.226305320Z" level=info msg="RemovePodSandbox \"60f308f7d779a7396a8a53c9e1dd79a987c15e289828ded3b814149f7b107c48\" returns successfully" Feb 13 15:58:31.226703 containerd[1564]: time="2025-02-13T15:58:31.226496401Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" Feb 13 15:58:31.226703 containerd[1564]: time="2025-02-13T15:58:31.226554333Z" level=info msg="TearDown network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" successfully" Feb 13 15:58:31.226703 containerd[1564]: time="2025-02-13T15:58:31.226577429Z" level=info msg="StopPodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" returns successfully" Feb 13 15:58:31.226830 containerd[1564]: time="2025-02-13T15:58:31.226705828Z" level=info msg="RemovePodSandbox for \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" Feb 13 15:58:31.226830 containerd[1564]: time="2025-02-13T15:58:31.226723339Z" level=info msg="Forcibly stopping sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\"" Feb 13 15:58:31.226830 containerd[1564]: time="2025-02-13T15:58:31.226758595Z" level=info msg="TearDown network for sandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" successfully" Feb 13 15:58:31.227807 containerd[1564]: time="2025-02-13T15:58:31.227791816Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.227846 containerd[1564]: time="2025-02-13T15:58:31.227820693Z" level=info msg="RemovePodSandbox \"98dff2cbaff989c06373d2cad8a0c5aee13f19bd99b5af9598bceb2765395bdc\" returns successfully" Feb 13 15:58:31.227994 containerd[1564]: time="2025-02-13T15:58:31.227981045Z" level=info msg="StopPodSandbox for \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\"" Feb 13 15:58:31.228292 containerd[1564]: time="2025-02-13T15:58:31.228020714Z" level=info msg="TearDown network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" successfully" Feb 13 15:58:31.228292 containerd[1564]: time="2025-02-13T15:58:31.228026519Z" level=info msg="StopPodSandbox for \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" returns successfully" Feb 13 15:58:31.228292 containerd[1564]: time="2025-02-13T15:58:31.228156371Z" level=info msg="RemovePodSandbox for \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\"" Feb 13 15:58:31.228292 containerd[1564]: time="2025-02-13T15:58:31.228166869Z" level=info msg="Forcibly stopping sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\"" Feb 13 15:58:31.228292 containerd[1564]: time="2025-02-13T15:58:31.228194656Z" level=info msg="TearDown network for sandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" successfully" Feb 13 15:58:31.229328 containerd[1564]: time="2025-02-13T15:58:31.229313818Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.229359 containerd[1564]: time="2025-02-13T15:58:31.229335810Z" level=info msg="RemovePodSandbox \"e767c3afe4ec4305c05f56562d9f0048bbb05c2d06b1b897db662298a0f39f8a\" returns successfully" Feb 13 15:58:31.229586 containerd[1564]: time="2025-02-13T15:58:31.229480599Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:31.229586 containerd[1564]: time="2025-02-13T15:58:31.229521464Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:31.229586 containerd[1564]: time="2025-02-13T15:58:31.229527719Z" level=info msg="StopPodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:31.229794 containerd[1564]: time="2025-02-13T15:58:31.229749807Z" level=info msg="RemovePodSandbox for \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:31.229794 containerd[1564]: time="2025-02-13T15:58:31.229761687Z" level=info msg="Forcibly stopping sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\"" Feb 13 15:58:31.229838 containerd[1564]: time="2025-02-13T15:58:31.229801593Z" level=info msg="TearDown network for sandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" successfully" Feb 13 15:58:31.231402 containerd[1564]: time="2025-02-13T15:58:31.231387332Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.231436 containerd[1564]: time="2025-02-13T15:58:31.231408806Z" level=info msg="RemovePodSandbox \"375e39079914f6a04e8c80a17e71980b09e80c9de9b1bf879e951665e82fa849\" returns successfully" Feb 13 15:58:31.231703 containerd[1564]: time="2025-02-13T15:58:31.231625837Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:31.231703 containerd[1564]: time="2025-02-13T15:58:31.231668083Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:31.231703 containerd[1564]: time="2025-02-13T15:58:31.231674803Z" level=info msg="StopPodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:31.231878 containerd[1564]: time="2025-02-13T15:58:31.231863493Z" level=info msg="RemovePodSandbox for \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:31.231878 containerd[1564]: time="2025-02-13T15:58:31.231877432Z" level=info msg="Forcibly stopping sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\"" Feb 13 15:58:31.231924 containerd[1564]: time="2025-02-13T15:58:31.231909619Z" level=info msg="TearDown network for sandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" successfully" Feb 13 15:58:31.235369 containerd[1564]: time="2025-02-13T15:58:31.235355001Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.235398 containerd[1564]: time="2025-02-13T15:58:31.235375780Z" level=info msg="RemovePodSandbox \"c2ced68572f9b3d83e94feb650d420673091f63aa4fe057235ce6b4f19676c3a\" returns successfully" Feb 13 15:58:31.235519 containerd[1564]: time="2025-02-13T15:58:31.235507299Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:31.235585 containerd[1564]: time="2025-02-13T15:58:31.235546712Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:31.235585 containerd[1564]: time="2025-02-13T15:58:31.235567710Z" level=info msg="StopPodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.235783588Z" level=info msg="RemovePodSandbox for \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.235792908Z" level=info msg="Forcibly stopping sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\"" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.235844888Z" level=info msg="TearDown network for sandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" successfully" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237261634Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237279575Z" level=info msg="RemovePodSandbox \"09ffb308bc1014bacea8f49104294acfcb492b750f42d368a8d999efd8f0672a\" returns successfully" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237401097Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237438494Z" level=info msg="TearDown network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" successfully" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237444280Z" level=info msg="StopPodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" returns successfully" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237571767Z" level=info msg="RemovePodSandbox for \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237583677Z" level=info msg="Forcibly stopping sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\"" Feb 13 15:58:31.238417 containerd[1564]: time="2025-02-13T15:58:31.237615016Z" level=info msg="TearDown network for sandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" successfully" Feb 13 15:58:31.239263 containerd[1564]: time="2025-02-13T15:58:31.239009212Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.239263 containerd[1564]: time="2025-02-13T15:58:31.239028888Z" level=info msg="RemovePodSandbox \"45a16c84c5b0a113cdfabe13c5b28989ed865b1c1dcc5f9e77384919df43ab48\" returns successfully" Feb 13 15:58:31.239263 containerd[1564]: time="2025-02-13T15:58:31.239201730Z" level=info msg="StopPodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" Feb 13 15:58:31.239263 containerd[1564]: time="2025-02-13T15:58:31.239237744Z" level=info msg="TearDown network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" successfully" Feb 13 15:58:31.239857 containerd[1564]: time="2025-02-13T15:58:31.239267176Z" level=info msg="StopPodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" returns successfully" Feb 13 15:58:31.239857 containerd[1564]: time="2025-02-13T15:58:31.239377300Z" level=info msg="RemovePodSandbox for \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" Feb 13 15:58:31.239857 containerd[1564]: time="2025-02-13T15:58:31.239387231Z" level=info msg="Forcibly stopping sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\"" Feb 13 15:58:31.239857 containerd[1564]: time="2025-02-13T15:58:31.239462573Z" level=info msg="TearDown network for sandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" successfully" Feb 13 15:58:31.240534 containerd[1564]: time="2025-02-13T15:58:31.240519147Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.240580 containerd[1564]: time="2025-02-13T15:58:31.240541122Z" level=info msg="RemovePodSandbox \"d52e9a70acf718e0e625f9c8896cfeb3711870013bf82a1276e45be56279f380\" returns successfully" Feb 13 15:58:31.240735 containerd[1564]: time="2025-02-13T15:58:31.240682187Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:31.240783 containerd[1564]: time="2025-02-13T15:58:31.240774330Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:31.240817 containerd[1564]: time="2025-02-13T15:58:31.240810621Z" level=info msg="StopPodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:31.241048 containerd[1564]: time="2025-02-13T15:58:31.240996937Z" level=info msg="RemovePodSandbox for \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:31.241077 containerd[1564]: time="2025-02-13T15:58:31.241050337Z" level=info msg="Forcibly stopping sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\"" Feb 13 15:58:31.241112 containerd[1564]: time="2025-02-13T15:58:31.241084629Z" level=info msg="TearDown network for sandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" successfully" Feb 13 15:58:31.242690 containerd[1564]: time="2025-02-13T15:58:31.242676598Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.242762 containerd[1564]: time="2025-02-13T15:58:31.242696971Z" level=info msg="RemovePodSandbox \"e1051e3e668084ba36d09dece5b34b6817e8dedb01681a46aaf9def17aa0ae97\" returns successfully" Feb 13 15:58:31.242906 containerd[1564]: time="2025-02-13T15:58:31.242831297Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:31.242906 containerd[1564]: time="2025-02-13T15:58:31.242870910Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:31.242906 containerd[1564]: time="2025-02-13T15:58:31.242877074Z" level=info msg="StopPodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:31.243190 containerd[1564]: time="2025-02-13T15:58:31.243104023Z" level=info msg="RemovePodSandbox for \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:31.243190 containerd[1564]: time="2025-02-13T15:58:31.243116350Z" level=info msg="Forcibly stopping sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\"" Feb 13 15:58:31.243190 containerd[1564]: time="2025-02-13T15:58:31.243160102Z" level=info msg="TearDown network for sandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" successfully" Feb 13 15:58:31.244868 containerd[1564]: time="2025-02-13T15:58:31.244802453Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.244868 containerd[1564]: time="2025-02-13T15:58:31.244825753Z" level=info msg="RemovePodSandbox \"d773a06f6b528d223f8fcc46339f87663109636b11a659ffa4e99f45fe159512\" returns successfully" Feb 13 15:58:31.245033 containerd[1564]: time="2025-02-13T15:58:31.244984367Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:31.245109 containerd[1564]: time="2025-02-13T15:58:31.245095277Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:31.245109 containerd[1564]: time="2025-02-13T15:58:31.245106090Z" level=info msg="StopPodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:31.245276 containerd[1564]: time="2025-02-13T15:58:31.245263171Z" level=info msg="RemovePodSandbox for \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:31.245299 containerd[1564]: time="2025-02-13T15:58:31.245278250Z" level=info msg="Forcibly stopping sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\"" Feb 13 15:58:31.245478 containerd[1564]: time="2025-02-13T15:58:31.245308766Z" level=info msg="TearDown network for sandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" successfully" Feb 13 15:58:31.247066 containerd[1564]: time="2025-02-13T15:58:31.247051940Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.247105 containerd[1564]: time="2025-02-13T15:58:31.247074292Z" level=info msg="RemovePodSandbox \"e38d3f0486dbea7f493e6669fdb585f38ce4c02cf1cf6ce600d1c52b1153c6bc\" returns successfully" Feb 13 15:58:31.247267 containerd[1564]: time="2025-02-13T15:58:31.247186767Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:31.247267 containerd[1564]: time="2025-02-13T15:58:31.247234310Z" level=info msg="TearDown network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" successfully" Feb 13 15:58:31.247267 containerd[1564]: time="2025-02-13T15:58:31.247240734Z" level=info msg="StopPodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" returns successfully" Feb 13 15:58:31.247354 containerd[1564]: time="2025-02-13T15:58:31.247339553Z" level=info msg="RemovePodSandbox for \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:31.247378 containerd[1564]: time="2025-02-13T15:58:31.247353906Z" level=info msg="Forcibly stopping sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\"" Feb 13 15:58:31.247462 containerd[1564]: time="2025-02-13T15:58:31.247439163Z" level=info msg="TearDown network for sandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" successfully" Feb 13 15:58:31.248878 containerd[1564]: time="2025-02-13T15:58:31.248864290Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.248965 containerd[1564]: time="2025-02-13T15:58:31.248884892Z" level=info msg="RemovePodSandbox \"449bd269ba6f3296f1cf5618988c60faafe943f31da45afe25d2fb97c559534e\" returns successfully" Feb 13 15:58:31.249053 containerd[1564]: time="2025-02-13T15:58:31.249042700Z" level=info msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" Feb 13 15:58:31.249185 containerd[1564]: time="2025-02-13T15:58:31.249144585Z" level=info msg="TearDown network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" successfully" Feb 13 15:58:31.249185 containerd[1564]: time="2025-02-13T15:58:31.249153027Z" level=info msg="StopPodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" returns successfully" Feb 13 15:58:31.249808 containerd[1564]: time="2025-02-13T15:58:31.249263353Z" level=info msg="RemovePodSandbox for \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" Feb 13 15:58:31.249808 containerd[1564]: time="2025-02-13T15:58:31.249275309Z" level=info msg="Forcibly stopping sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\"" Feb 13 15:58:31.249808 containerd[1564]: time="2025-02-13T15:58:31.249306062Z" level=info msg="TearDown network for sandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" successfully" Feb 13 15:58:31.250600 containerd[1564]: time="2025-02-13T15:58:31.250585263Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 15:58:31.250637 containerd[1564]: time="2025-02-13T15:58:31.250605128Z" level=info msg="RemovePodSandbox \"41e861f5b7f7c30acf09f3479838df05d73d5082aba3170b3166c980110ba1a2\" returns successfully" Feb 13 15:58:33.998204 kubelet[2920]: I0213 15:58:33.998097 2920 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 15:58:34.797794 systemd[1]: Started sshd@17-139.178.70.106:22-188.94.154.98:54104.service - OpenSSH per-connection server daemon (188.94.154.98:54104). Feb 13 15:58:36.080082 sshd[5952]: Invalid user debian-tor from 188.94.154.98 port 54104 Feb 13 15:58:36.305259 sshd[5952]: Received disconnect from 188.94.154.98 port 54104:11: Bye Bye [preauth] Feb 13 15:58:36.305259 sshd[5952]: Disconnected from invalid user debian-tor 188.94.154.98 port 54104 [preauth] Feb 13 15:58:36.306540 systemd[1]: sshd@17-139.178.70.106:22-188.94.154.98:54104.service: Deactivated successfully. Feb 13 15:58:47.645456 systemd[1]: Started sshd@18-139.178.70.106:22-147.75.109.163:58488.service - OpenSSH per-connection server daemon (147.75.109.163:58488). Feb 13 15:58:47.721396 sshd[5985]: Accepted publickey for core from 147.75.109.163 port 58488 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4 Feb 13 15:58:47.722782 sshd-session[5985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 15:58:47.731317 systemd-logind[1544]: New session 10 of user core. Feb 13 15:58:47.735783 systemd[1]: Started session-10.scope - Session 10 of User core. Feb 13 15:58:48.119610 systemd[1]: run-containerd-runc-k8s.io-3e313fc43d6c840f0f3bdf71f4c4cf0d2ae985aa0b830e3e9b340b7af9c32958-runc.9HUzCC.mount: Deactivated successfully. Feb 13 15:58:48.697852 sshd[5988]: Connection closed by 147.75.109.163 port 58488 Feb 13 15:58:48.698140 sshd-session[5985]: pam_unix(sshd:session): session closed for user core Feb 13 15:58:48.701733 systemd[1]: sshd@18-139.178.70.106:22-147.75.109.163:58488.service: Deactivated successfully. 
Feb 13 15:58:48.703212 systemd[1]: session-10.scope: Deactivated successfully.
Feb 13 15:58:48.703885 systemd-logind[1544]: Session 10 logged out. Waiting for processes to exit.
Feb 13 15:58:48.704644 systemd-logind[1544]: Removed session 10.
Feb 13 15:58:53.705736 systemd[1]: Started sshd@19-139.178.70.106:22-147.75.109.163:55306.service - OpenSSH per-connection server daemon (147.75.109.163:55306).
Feb 13 15:58:53.756392 sshd[6029]: Accepted publickey for core from 147.75.109.163 port 55306 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:58:53.757250 sshd-session[6029]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:58:53.761236 systemd-logind[1544]: New session 11 of user core.
Feb 13 15:58:53.767690 systemd[1]: Started session-11.scope - Session 11 of User core.
Feb 13 15:58:53.900023 sshd[6031]: Connection closed by 147.75.109.163 port 55306
Feb 13 15:58:53.907585 sshd-session[6029]: pam_unix(sshd:session): session closed for user core
Feb 13 15:58:53.910098 systemd-logind[1544]: Session 11 logged out. Waiting for processes to exit.
Feb 13 15:58:53.910126 systemd[1]: sshd@19-139.178.70.106:22-147.75.109.163:55306.service: Deactivated successfully.
Feb 13 15:58:53.911463 systemd[1]: session-11.scope: Deactivated successfully.
Feb 13 15:58:53.912422 systemd-logind[1544]: Removed session 11.
Feb 13 15:58:58.911503 systemd[1]: Started sshd@20-139.178.70.106:22-147.75.109.163:55308.service - OpenSSH per-connection server daemon (147.75.109.163:55308).
Feb 13 15:58:59.123196 sshd[6046]: Accepted publickey for core from 147.75.109.163 port 55308 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:58:59.124676 sshd-session[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:58:59.128767 systemd-logind[1544]: New session 12 of user core.
Feb 13 15:58:59.136680 systemd[1]: Started session-12.scope - Session 12 of User core.
Feb 13 15:58:59.254968 sshd[6048]: Connection closed by 147.75.109.163 port 55308
Feb 13 15:58:59.256053 sshd-session[6046]: pam_unix(sshd:session): session closed for user core
Feb 13 15:58:59.264191 systemd[1]: sshd@20-139.178.70.106:22-147.75.109.163:55308.service: Deactivated successfully.
Feb 13 15:58:59.265340 systemd[1]: session-12.scope: Deactivated successfully.
Feb 13 15:58:59.266378 systemd-logind[1544]: Session 12 logged out. Waiting for processes to exit.
Feb 13 15:58:59.270716 systemd[1]: Started sshd@21-139.178.70.106:22-147.75.109.163:45772.service - OpenSSH per-connection server daemon (147.75.109.163:45772).
Feb 13 15:58:59.272071 systemd-logind[1544]: Removed session 12.
Feb 13 15:58:59.303670 sshd[6059]: Accepted publickey for core from 147.75.109.163 port 45772 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:58:59.304488 sshd-session[6059]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:58:59.307820 systemd-logind[1544]: New session 13 of user core.
Feb 13 15:58:59.312646 systemd[1]: Started session-13.scope - Session 13 of User core.
Feb 13 15:58:59.576885 sshd[6062]: Connection closed by 147.75.109.163 port 45772
Feb 13 15:58:59.583883 systemd[1]: Started sshd@22-139.178.70.106:22-147.75.109.163:45788.service - OpenSSH per-connection server daemon (147.75.109.163:45788).
Feb 13 15:58:59.599802 sshd-session[6059]: pam_unix(sshd:session): session closed for user core
Feb 13 15:58:59.622480 systemd[1]: sshd@21-139.178.70.106:22-147.75.109.163:45772.service: Deactivated successfully.
Feb 13 15:58:59.623847 systemd[1]: session-13.scope: Deactivated successfully.
Feb 13 15:58:59.624375 systemd-logind[1544]: Session 13 logged out. Waiting for processes to exit.
Feb 13 15:58:59.625450 systemd-logind[1544]: Removed session 13.
Feb 13 15:58:59.857812 sshd[6069]: Accepted publickey for core from 147.75.109.163 port 45788 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:58:59.867081 sshd-session[6069]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:58:59.877158 systemd-logind[1544]: New session 14 of user core.
Feb 13 15:58:59.884660 systemd[1]: Started session-14.scope - Session 14 of User core.
Feb 13 15:59:00.025231 sshd[6074]: Connection closed by 147.75.109.163 port 45788
Feb 13 15:59:00.025610 sshd-session[6069]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:00.027589 systemd[1]: sshd@22-139.178.70.106:22-147.75.109.163:45788.service: Deactivated successfully.
Feb 13 15:59:00.028863 systemd[1]: session-14.scope: Deactivated successfully.
Feb 13 15:59:00.029386 systemd-logind[1544]: Session 14 logged out. Waiting for processes to exit.
Feb 13 15:59:00.030027 systemd-logind[1544]: Removed session 14.
Feb 13 15:59:05.034652 systemd[1]: Started sshd@23-139.178.70.106:22-147.75.109.163:45794.service - OpenSSH per-connection server daemon (147.75.109.163:45794).
Feb 13 15:59:05.069675 sshd[6106]: Accepted publickey for core from 147.75.109.163 port 45794 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:05.070564 sshd-session[6106]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:05.073208 systemd-logind[1544]: New session 15 of user core.
Feb 13 15:59:05.084713 systemd[1]: Started session-15.scope - Session 15 of User core.
Feb 13 15:59:05.190574 sshd[6108]: Connection closed by 147.75.109.163 port 45794
Feb 13 15:59:05.190827 sshd-session[6106]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:05.192778 systemd[1]: sshd@23-139.178.70.106:22-147.75.109.163:45794.service: Deactivated successfully.
Feb 13 15:59:05.193866 systemd[1]: session-15.scope: Deactivated successfully.
Feb 13 15:59:05.194493 systemd-logind[1544]: Session 15 logged out. Waiting for processes to exit.
Feb 13 15:59:05.195044 systemd-logind[1544]: Removed session 15.
Feb 13 15:59:07.438092 systemd[1]: Started sshd@24-139.178.70.106:22-185.213.165.55:37792.service - OpenSSH per-connection server daemon (185.213.165.55:37792).
Feb 13 15:59:08.836520 sshd[6123]: Invalid user wang from 185.213.165.55 port 37792
Feb 13 15:59:09.104898 sshd[6123]: Received disconnect from 185.213.165.55 port 37792:11: Bye Bye [preauth]
Feb 13 15:59:09.104898 sshd[6123]: Disconnected from invalid user wang 185.213.165.55 port 37792 [preauth]
Feb 13 15:59:09.106678 systemd[1]: sshd@24-139.178.70.106:22-185.213.165.55:37792.service: Deactivated successfully.
Feb 13 15:59:09.539092 systemd[1]: run-containerd-runc-k8s.io-f45f27d87c96cc914c08383bf75a8facd90d19509ac6b8e4e05ee1a3e423f199-runc.R7i3U6.mount: Deactivated successfully.
Feb 13 15:59:10.198424 systemd[1]: Started sshd@25-139.178.70.106:22-147.75.109.163:60434.service - OpenSSH per-connection server daemon (147.75.109.163:60434).
Feb 13 15:59:10.653209 sshd[6153]: Accepted publickey for core from 147.75.109.163 port 60434 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:10.661765 sshd-session[6153]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:10.674326 systemd-logind[1544]: New session 16 of user core.
Feb 13 15:59:10.676679 systemd[1]: Started session-16.scope - Session 16 of User core.
Feb 13 15:59:10.841873 sshd[6155]: Connection closed by 147.75.109.163 port 60434
Feb 13 15:59:10.842259 sshd-session[6153]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:10.845064 systemd[1]: sshd@25-139.178.70.106:22-147.75.109.163:60434.service: Deactivated successfully.
Feb 13 15:59:10.846324 systemd[1]: session-16.scope: Deactivated successfully.
Feb 13 15:59:10.846865 systemd-logind[1544]: Session 16 logged out. Waiting for processes to exit.
Feb 13 15:59:10.847399 systemd-logind[1544]: Removed session 16.
Feb 13 15:59:15.853541 systemd[1]: Started sshd@26-139.178.70.106:22-147.75.109.163:60436.service - OpenSSH per-connection server daemon (147.75.109.163:60436).
Feb 13 15:59:15.904716 sshd[6173]: Accepted publickey for core from 147.75.109.163 port 60436 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:15.906172 sshd-session[6173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:15.910976 systemd-logind[1544]: New session 17 of user core.
Feb 13 15:59:15.918864 systemd[1]: Started session-17.scope - Session 17 of User core.
Feb 13 15:59:16.062962 sshd[6175]: Connection closed by 147.75.109.163 port 60436
Feb 13 15:59:16.063738 sshd-session[6173]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:16.078901 systemd[1]: Started sshd@27-139.178.70.106:22-147.75.109.163:60440.service - OpenSSH per-connection server daemon (147.75.109.163:60440).
Feb 13 15:59:16.079451 systemd[1]: sshd@26-139.178.70.106:22-147.75.109.163:60436.service: Deactivated successfully.
Feb 13 15:59:16.081189 systemd[1]: session-17.scope: Deactivated successfully.
Feb 13 15:59:16.084076 systemd-logind[1544]: Session 17 logged out. Waiting for processes to exit.
Feb 13 15:59:16.085042 systemd-logind[1544]: Removed session 17.
Feb 13 15:59:16.117432 sshd[6183]: Accepted publickey for core from 147.75.109.163 port 60440 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:16.118991 sshd-session[6183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:16.124094 systemd-logind[1544]: New session 18 of user core.
Feb 13 15:59:16.128727 systemd[1]: Started session-18.scope - Session 18 of User core.
Feb 13 15:59:16.703338 sshd[6188]: Connection closed by 147.75.109.163 port 60440
Feb 13 15:59:16.705590 sshd-session[6183]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:16.711266 systemd[1]: sshd@27-139.178.70.106:22-147.75.109.163:60440.service: Deactivated successfully.
Feb 13 15:59:16.712618 systemd[1]: session-18.scope: Deactivated successfully.
Feb 13 15:59:16.713588 systemd-logind[1544]: Session 18 logged out. Waiting for processes to exit.
Feb 13 15:59:16.717788 systemd[1]: Started sshd@28-139.178.70.106:22-147.75.109.163:60456.service - OpenSSH per-connection server daemon (147.75.109.163:60456).
Feb 13 15:59:16.720492 systemd-logind[1544]: Removed session 18.
Feb 13 15:59:16.837975 sshd[6197]: Accepted publickey for core from 147.75.109.163 port 60456 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:16.839340 sshd-session[6197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:16.843098 systemd-logind[1544]: New session 19 of user core.
Feb 13 15:59:16.849662 systemd[1]: Started session-19.scope - Session 19 of User core.
Feb 13 15:59:18.769498 sshd[6200]: Connection closed by 147.75.109.163 port 60456
Feb 13 15:59:18.768492 sshd-session[6197]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:18.778420 systemd[1]: sshd@28-139.178.70.106:22-147.75.109.163:60456.service: Deactivated successfully.
Feb 13 15:59:18.780815 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 15:59:18.780966 systemd[1]: session-19.scope: Consumed 421ms CPU time, 65.8M memory peak.
Feb 13 15:59:18.782607 systemd-logind[1544]: Session 19 logged out. Waiting for processes to exit.
Feb 13 15:59:18.790801 systemd[1]: Started sshd@29-139.178.70.106:22-147.75.109.163:60468.service - OpenSSH per-connection server daemon (147.75.109.163:60468).
Feb 13 15:59:18.793349 systemd-logind[1544]: Removed session 19.
Feb 13 15:59:18.870564 sshd[6217]: Accepted publickey for core from 147.75.109.163 port 60468 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:18.870496 sshd-session[6217]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:18.873702 systemd-logind[1544]: New session 20 of user core.
Feb 13 15:59:18.879643 systemd[1]: Started session-20.scope - Session 20 of User core.
Feb 13 15:59:19.556907 sshd[6222]: Connection closed by 147.75.109.163 port 60468
Feb 13 15:59:19.557526 sshd-session[6217]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:19.567855 systemd[1]: sshd@29-139.178.70.106:22-147.75.109.163:60468.service: Deactivated successfully.
Feb 13 15:59:19.569156 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 15:59:19.572467 systemd-logind[1544]: Session 20 logged out. Waiting for processes to exit.
Feb 13 15:59:19.577821 systemd[1]: Started sshd@30-139.178.70.106:22-147.75.109.163:58506.service - OpenSSH per-connection server daemon (147.75.109.163:58506).
Feb 13 15:59:19.580259 systemd-logind[1544]: Removed session 20.
Feb 13 15:59:19.623524 sshd[6232]: Accepted publickey for core from 147.75.109.163 port 58506 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:19.624363 sshd-session[6232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:19.629864 systemd-logind[1544]: New session 21 of user core.
Feb 13 15:59:19.635674 systemd[1]: Started session-21.scope - Session 21 of User core.
Feb 13 15:59:19.740761 sshd[6235]: Connection closed by 147.75.109.163 port 58506
Feb 13 15:59:19.744380 systemd-logind[1544]: Session 21 logged out. Waiting for processes to exit.
Feb 13 15:59:19.741308 sshd-session[6232]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:19.744880 systemd[1]: sshd@30-139.178.70.106:22-147.75.109.163:58506.service: Deactivated successfully.
Feb 13 15:59:19.746742 systemd[1]: session-21.scope: Deactivated successfully.
Feb 13 15:59:19.747675 systemd-logind[1544]: Removed session 21.
Feb 13 15:59:24.750073 systemd[1]: Started sshd@31-139.178.70.106:22-147.75.109.163:58522.service - OpenSSH per-connection server daemon (147.75.109.163:58522).
Feb 13 15:59:24.798537 sshd[6250]: Accepted publickey for core from 147.75.109.163 port 58522 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:24.799395 sshd-session[6250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:24.802336 systemd-logind[1544]: New session 22 of user core.
Feb 13 15:59:24.809649 systemd[1]: Started session-22.scope - Session 22 of User core.
Feb 13 15:59:24.930331 sshd[6252]: Connection closed by 147.75.109.163 port 58522
Feb 13 15:59:24.931029 sshd-session[6250]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:24.933295 systemd-logind[1544]: Session 22 logged out. Waiting for processes to exit.
Feb 13 15:59:24.933415 systemd[1]: sshd@31-139.178.70.106:22-147.75.109.163:58522.service: Deactivated successfully.
Feb 13 15:59:24.935050 systemd[1]: session-22.scope: Deactivated successfully.
Feb 13 15:59:24.935811 systemd-logind[1544]: Removed session 22.
Feb 13 15:59:29.941140 systemd[1]: Started sshd@32-139.178.70.106:22-147.75.109.163:44398.service - OpenSSH per-connection server daemon (147.75.109.163:44398).
Feb 13 15:59:29.995234 sshd[6266]: Accepted publickey for core from 147.75.109.163 port 44398 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:29.996195 sshd-session[6266]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:29.999981 systemd-logind[1544]: New session 23 of user core.
Feb 13 15:59:30.008704 systemd[1]: Started session-23.scope - Session 23 of User core.
Feb 13 15:59:30.164099 sshd[6268]: Connection closed by 147.75.109.163 port 44398
Feb 13 15:59:30.166929 sshd-session[6266]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:30.168713 systemd[1]: sshd@32-139.178.70.106:22-147.75.109.163:44398.service: Deactivated successfully.
Feb 13 15:59:30.170020 systemd[1]: session-23.scope: Deactivated successfully.
Feb 13 15:59:30.170922 systemd-logind[1544]: Session 23 logged out. Waiting for processes to exit.
Feb 13 15:59:30.171469 systemd-logind[1544]: Removed session 23.
Feb 13 15:59:35.175305 systemd[1]: Started sshd@33-139.178.70.106:22-147.75.109.163:44412.service - OpenSSH per-connection server daemon (147.75.109.163:44412).
Feb 13 15:59:35.245053 sshd[6310]: Accepted publickey for core from 147.75.109.163 port 44412 ssh2: RSA SHA256:LJvk3bMKOba32ZMZ0NZF/qAHKkXqchkiz1G/QWjohy4
Feb 13 15:59:35.245871 sshd-session[6310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 15:59:35.249461 systemd-logind[1544]: New session 24 of user core.
Feb 13 15:59:35.254655 systemd[1]: Started session-24.scope - Session 24 of User core.
Feb 13 15:59:35.543103 sshd[6312]: Connection closed by 147.75.109.163 port 44412
Feb 13 15:59:35.544627 sshd-session[6310]: pam_unix(sshd:session): session closed for user core
Feb 13 15:59:35.546747 systemd-logind[1544]: Session 24 logged out. Waiting for processes to exit.
Feb 13 15:59:35.547101 systemd[1]: sshd@33-139.178.70.106:22-147.75.109.163:44412.service: Deactivated successfully.
Feb 13 15:59:35.549213 systemd[1]: session-24.scope: Deactivated successfully.
Feb 13 15:59:35.550947 systemd-logind[1544]: Removed session 24.