Feb 13 19:50:24.735189 kernel: Linux version 6.6.74-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241116 p3) 14.2.1 20241116, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Thu Feb 13 17:41:03 -00 2025
Feb 13 19:50:24.735206 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=015d1d9e5e601f6a4e226c935072d3d0819e7eb2da20e68715973498f21aa3fe
Feb 13 19:50:24.735212 kernel: Disabled fast string operations
Feb 13 19:50:24.735217 kernel: BIOS-provided physical RAM map:
Feb 13 19:50:24.735221 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Feb 13 19:50:24.735225 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Feb 13 19:50:24.735231 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Feb 13 19:50:24.735235 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Feb 13 19:50:24.735239 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Feb 13 19:50:24.735244 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Feb 13 19:50:24.735248 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Feb 13 19:50:24.735252 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Feb 13 19:50:24.735256 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Feb 13 19:50:24.735261 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Feb 13 19:50:24.735267 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Feb 13 19:50:24.735272 kernel: NX (Execute Disable) protection: active
Feb 13 19:50:24.735277 kernel: APIC: Static calls initialized
Feb 13 19:50:24.735282 kernel: SMBIOS 2.7 present.
Feb 13 19:50:24.735287 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Feb 13 19:50:24.735291 kernel: vmware: hypercall mode: 0x00
Feb 13 19:50:24.735296 kernel: Hypervisor detected: VMware
Feb 13 19:50:24.735301 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Feb 13 19:50:24.735306 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Feb 13 19:50:24.735311 kernel: vmware: using clock offset of 2469193083 ns
Feb 13 19:50:24.735316 kernel: tsc: Detected 3408.000 MHz processor
Feb 13 19:50:24.735321 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 13 19:50:24.735326 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 13 19:50:24.735331 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Feb 13 19:50:24.735336 kernel: total RAM covered: 3072M
Feb 13 19:50:24.735341 kernel: Found optimal setting for mtrr clean up
Feb 13 19:50:24.735347 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Feb 13 19:50:24.735352 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Feb 13 19:50:24.735358 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Feb 13 19:50:24.735362 kernel: Using GB pages for direct mapping
Feb 13 19:50:24.735367 kernel: ACPI: Early table checksum verification disabled
Feb 13 19:50:24.735372 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Feb 13 19:50:24.735377 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Feb 13 19:50:24.735382 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Feb 13 19:50:24.735387 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Feb 13 19:50:24.735392 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 19:50:24.735668 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Feb 13 19:50:24.735677 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Feb 13 19:50:24.735682 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Feb 13 19:50:24.735688 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Feb 13 19:50:24.735693 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Feb 13 19:50:24.735698 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Feb 13 19:50:24.735705 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Feb 13 19:50:24.735711 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Feb 13 19:50:24.735716 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Feb 13 19:50:24.735721 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 19:50:24.735726 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Feb 13 19:50:24.735731 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Feb 13 19:50:24.735736 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Feb 13 19:50:24.735742 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Feb 13 19:50:24.735747 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Feb 13 19:50:24.735753 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Feb 13 19:50:24.735759 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Feb 13 19:50:24.735764 kernel: system APIC only can use physical flat
Feb 13 19:50:24.735769 kernel: APIC: Switched APIC routing to: physical flat
Feb 13 19:50:24.735774 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Feb 13 19:50:24.735779 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Feb 13 19:50:24.735785 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Feb 13 19:50:24.735790 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Feb 13 19:50:24.735795 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Feb 13 19:50:24.735800 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Feb 13 19:50:24.735806 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Feb 13 19:50:24.735811 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Feb 13 19:50:24.735816 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Feb 13 19:50:24.735821 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Feb 13 19:50:24.735826 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Feb 13 19:50:24.735831 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Feb 13 19:50:24.735836 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Feb 13 19:50:24.735841 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Feb 13 19:50:24.735846 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Feb 13 19:50:24.735851 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Feb 13 19:50:24.735857 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Feb 13 19:50:24.735862 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Feb 13 19:50:24.735867 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Feb 13 19:50:24.735872 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Feb 13 19:50:24.735877 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Feb 13 19:50:24.735882 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Feb 13 19:50:24.735887 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Feb 13 19:50:24.735892 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Feb 13 19:50:24.735897 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Feb 13 19:50:24.735902 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Feb 13 19:50:24.735908 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Feb 13 19:50:24.735914 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Feb 13 19:50:24.735919 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Feb 13 19:50:24.735924 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Feb 13 19:50:24.735929 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Feb 13 19:50:24.735934 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Feb 13 19:50:24.735939 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Feb 13 19:50:24.735943 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Feb 13 19:50:24.735949 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Feb 13 19:50:24.735954 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Feb 13 19:50:24.735960 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Feb 13 19:50:24.735965 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Feb 13 19:50:24.735970 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Feb 13 19:50:24.735975 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Feb 13 19:50:24.735980 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Feb 13 19:50:24.735985 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Feb 13 19:50:24.735990 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Feb 13 19:50:24.735995 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Feb 13 19:50:24.736000 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Feb 13 19:50:24.736005 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Feb 13 19:50:24.736011 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Feb 13 19:50:24.736016 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Feb 13 19:50:24.736021 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Feb 13 19:50:24.736026 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Feb 13 19:50:24.736031 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Feb 13 19:50:24.736036 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Feb 13 19:50:24.736041 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Feb 13 19:50:24.736046 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Feb 13 19:50:24.736051 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Feb 13 19:50:24.736056 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Feb 13 19:50:24.736061 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Feb 13 19:50:24.736067 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Feb 13 19:50:24.736072 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Feb 13 19:50:24.736081 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Feb 13 19:50:24.736088 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Feb 13 19:50:24.736093 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Feb 13 19:50:24.736098 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Feb 13 19:50:24.736104 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Feb 13 19:50:24.736109 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Feb 13 19:50:24.736115 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Feb 13 19:50:24.736121 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Feb 13 19:50:24.736126 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Feb 13 19:50:24.736132 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Feb 13 19:50:24.736137 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Feb 13 19:50:24.736142 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Feb 13 19:50:24.736147 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Feb 13 19:50:24.736153 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Feb 13 19:50:24.736158 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Feb 13 19:50:24.736163 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Feb 13 19:50:24.736170 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Feb 13 19:50:24.736175 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Feb 13 19:50:24.736181 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Feb 13 19:50:24.736186 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Feb 13 19:50:24.736191 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Feb 13 19:50:24.736197 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Feb 13 19:50:24.736202 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Feb 13 19:50:24.736207 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Feb 13 19:50:24.736213 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Feb 13 19:50:24.736218 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Feb 13 19:50:24.736225 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Feb 13 19:50:24.736230 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Feb 13 19:50:24.736235 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Feb 13 19:50:24.736240 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Feb 13 19:50:24.736246 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Feb 13 19:50:24.736251 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Feb 13 19:50:24.736256 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Feb 13 19:50:24.736262 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Feb 13 19:50:24.736267 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Feb 13 19:50:24.736272 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Feb 13 19:50:24.736277 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Feb 13 19:50:24.736284 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Feb 13 19:50:24.736289 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Feb 13 19:50:24.736295 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Feb 13 19:50:24.736300 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Feb 13 19:50:24.736305 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Feb 13 19:50:24.736311 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Feb 13 19:50:24.736316 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Feb 13 19:50:24.736321 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Feb 13 19:50:24.736326 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Feb 13 19:50:24.736332 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Feb 13 19:50:24.736338 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Feb 13 19:50:24.736343 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Feb 13 19:50:24.736349 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Feb 13 19:50:24.736354 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Feb 13 19:50:24.736359 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Feb 13 19:50:24.736365 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Feb 13 19:50:24.736370 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Feb 13 19:50:24.736376 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Feb 13 19:50:24.736381 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Feb 13 19:50:24.736386 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Feb 13 19:50:24.736393 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Feb 13 19:50:24.736398 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Feb 13 19:50:24.736415 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Feb 13 19:50:24.736420 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Feb 13 19:50:24.736426 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Feb 13 19:50:24.736431 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Feb 13 19:50:24.736437 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Feb 13 19:50:24.736442 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Feb 13 19:50:24.736447 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Feb 13 19:50:24.736452 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Feb 13 19:50:24.736460 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Feb 13 19:50:24.736465 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Feb 13 19:50:24.736470 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Feb 13 19:50:24.736476 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Feb 13 19:50:24.736481 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Feb 13 19:50:24.736487 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Feb 13 19:50:24.736493 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Feb 13 19:50:24.736499 kernel: Zone ranges:
Feb 13 19:50:24.736504 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Feb 13 19:50:24.736509 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Feb 13 19:50:24.736516 kernel: Normal empty
Feb 13 19:50:24.736522 kernel: Movable zone start for each node
Feb 13 19:50:24.736527 kernel: Early memory node ranges
Feb 13 19:50:24.736532 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Feb 13 19:50:24.736538 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Feb 13 19:50:24.736543 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Feb 13 19:50:24.736549 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Feb 13 19:50:24.736555 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 13 19:50:24.736560 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Feb 13 19:50:24.736567 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Feb 13 19:50:24.736572 kernel: ACPI: PM-Timer IO Port: 0x1008
Feb 13 19:50:24.736578 kernel: system APIC only can use physical flat
Feb 13 19:50:24.736583 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Feb 13 19:50:24.736588 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Feb 13 19:50:24.736594 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Feb 13 19:50:24.736599 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Feb 13 19:50:24.736605 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Feb 13 19:50:24.736610 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Feb 13 19:50:24.736615 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Feb 13 19:50:24.736622 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Feb 13 19:50:24.736627 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Feb 13 19:50:24.736633 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Feb 13 19:50:24.736638 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Feb 13 19:50:24.736643 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Feb 13 19:50:24.736649 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Feb 13 19:50:24.736654 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Feb 13 19:50:24.736660 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Feb 13 19:50:24.736665 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Feb 13 19:50:24.736671 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Feb 13 19:50:24.736677 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Feb 13 19:50:24.736682 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Feb 13 19:50:24.736688 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Feb 13 19:50:24.736697 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Feb 13 19:50:24.736703 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Feb 13 19:50:24.736708 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Feb 13 19:50:24.736714 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Feb 13 19:50:24.736719 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Feb 13 19:50:24.736724 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Feb 13 19:50:24.736731 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Feb 13 19:50:24.736737 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Feb 13 19:50:24.736742 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Feb 13 19:50:24.736747 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Feb 13 19:50:24.736753 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Feb 13 19:50:24.736758 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Feb 13 19:50:24.736763 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Feb 13 19:50:24.736769 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Feb 13 19:50:24.736774 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Feb 13 19:50:24.736780 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Feb 13 19:50:24.736786 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Feb 13 19:50:24.736792 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Feb 13 19:50:24.736797 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Feb 13 19:50:24.736803 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Feb 13 19:50:24.736808 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Feb 13 19:50:24.736813 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Feb 13 19:50:24.736819 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Feb 13 19:50:24.736824 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Feb 13 19:50:24.736829 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Feb 13 19:50:24.736835 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Feb 13 19:50:24.736841 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Feb 13 19:50:24.736847 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Feb 13 19:50:24.736852 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Feb 13 19:50:24.736858 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Feb 13 19:50:24.736863 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Feb 13 19:50:24.736868 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Feb 13 19:50:24.736874 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Feb 13 19:50:24.736879 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Feb 13 19:50:24.736885 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Feb 13 19:50:24.736890 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Feb 13 19:50:24.736897 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Feb 13 19:50:24.736902 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Feb 13 19:50:24.736908 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Feb 13 19:50:24.736913 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Feb 13 19:50:24.736918 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Feb 13 19:50:24.736924 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Feb 13 19:50:24.736929 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Feb 13 19:50:24.736934 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Feb 13 19:50:24.736940 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Feb 13 19:50:24.736946 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Feb 13 19:50:24.736952 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Feb 13 19:50:24.736957 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Feb 13 19:50:24.736962 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Feb 13 19:50:24.736968 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Feb 13 19:50:24.736973 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Feb 13 19:50:24.736979 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Feb 13 19:50:24.736984 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Feb 13 19:50:24.736990 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Feb 13 19:50:24.736995 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Feb 13 19:50:24.737001 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Feb 13 19:50:24.737007 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Feb 13 19:50:24.737012 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Feb 13 19:50:24.737018 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Feb 13 19:50:24.737023 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Feb 13 19:50:24.737028 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Feb 13 19:50:24.737034 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Feb 13 19:50:24.737039 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Feb 13 19:50:24.737044 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Feb 13 19:50:24.737050 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Feb 13 19:50:24.737056 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Feb 13 19:50:24.737062 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Feb 13 19:50:24.737067 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Feb 13 19:50:24.737073 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Feb 13 19:50:24.737078 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Feb 13 19:50:24.737084 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Feb 13 19:50:24.737089 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Feb 13 19:50:24.737094 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Feb 13 19:50:24.737100 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Feb 13 19:50:24.737106 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Feb 13 19:50:24.737112 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Feb 13 19:50:24.737117 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Feb 13 19:50:24.737123 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Feb 13 19:50:24.737128 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Feb 13 19:50:24.737134 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Feb 13 19:50:24.737139 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Feb 13 19:50:24.737144 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Feb 13 19:50:24.737150 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Feb 13 19:50:24.737155 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Feb 13 19:50:24.737162 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Feb 13 19:50:24.737167 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Feb 13 19:50:24.737173 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Feb 13 19:50:24.737178 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Feb 13 19:50:24.737183 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Feb 13 19:50:24.737189 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Feb 13 19:50:24.737194 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Feb 13 19:50:24.737199 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Feb 13 19:50:24.737205 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Feb 13 19:50:24.737210 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Feb 13 19:50:24.737217 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Feb 13 19:50:24.737223 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Feb 13 19:50:24.737228 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Feb 13 19:50:24.737233 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Feb 13 19:50:24.737239 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Feb 13 19:50:24.737244 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Feb 13 19:50:24.737250 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Feb 13 19:50:24.737255 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Feb 13 19:50:24.737260 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Feb 13 19:50:24.737267 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Feb 13 19:50:24.737272 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Feb 13 19:50:24.737278 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Feb 13 19:50:24.737283 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Feb 13 19:50:24.737289 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Feb 13 19:50:24.737294 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Feb 13 19:50:24.737299 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Feb 13 19:50:24.737305 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 13 19:50:24.737310 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Feb 13 19:50:24.737316 kernel: TSC deadline timer available
Feb 13 19:50:24.737322 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Feb 13 19:50:24.737328 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Feb 13 19:50:24.737333 kernel: Booting paravirtualized kernel on VMware hypervisor
Feb 13 19:50:24.737339 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 13 19:50:24.737344 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Feb 13 19:50:24.737350 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Feb 13 19:50:24.737356 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Feb 13 19:50:24.737361 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Feb 13 19:50:24.737367 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Feb 13 19:50:24.737373 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Feb 13 19:50:24.737378 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Feb 13 19:50:24.737384 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Feb 13 19:50:24.737396 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Feb 13 19:50:24.737438 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Feb 13 19:50:24.737444 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Feb 13 19:50:24.737450 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Feb 13 19:50:24.737456 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Feb 13 19:50:24.737464 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Feb 13 19:50:24.737470 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Feb 13 19:50:24.737475 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Feb 13 19:50:24.737481 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Feb 13 19:50:24.737487 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Feb 13 19:50:24.737493 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Feb 13 19:50:24.737499 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=015d1d9e5e601f6a4e226c935072d3d0819e7eb2da20e68715973498f21aa3fe
Feb 13 19:50:24.737505 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Feb 13 19:50:24.737512 kernel: random: crng init done
Feb 13 19:50:24.737518 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Feb 13 19:50:24.737524 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Feb 13 19:50:24.737530 kernel: printk: log_buf_len min size: 262144 bytes
Feb 13 19:50:24.737536 kernel: printk: log_buf_len: 1048576 bytes
Feb 13 19:50:24.737542 kernel: printk: early log buf free: 239648(91%)
Feb 13 19:50:24.737548 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 13 19:50:24.737554 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 13 19:50:24.737560 kernel: Fallback order for Node 0: 0
Feb 13 19:50:24.737571 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Feb 13 19:50:24.737577 kernel: Policy zone: DMA32
Feb 13 19:50:24.737582 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 13 19:50:24.737589 kernel: Memory: 1934320K/2096628K available (14336K kernel code, 2301K rwdata, 22800K rodata, 43320K init, 1752K bss, 162048K reserved, 0K cma-reserved)
Feb 13 19:50:24.737596 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Feb 13 19:50:24.737602 kernel: ftrace: allocating 37893 entries in 149 pages
Feb 13 19:50:24.737609 kernel: ftrace: allocated 149 pages with 4 groups
Feb 13 19:50:24.737615 kernel: Dynamic Preempt: voluntary
Feb 13 19:50:24.737620 kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 13 19:50:24.737627 kernel: rcu: RCU event tracing is enabled.
Feb 13 19:50:24.737634 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Feb 13 19:50:24.737640 kernel: Trampoline variant of Tasks RCU enabled.
Feb 13 19:50:24.737646 kernel: Rude variant of Tasks RCU enabled.
Feb 13 19:50:24.737652 kernel: Tracing variant of Tasks RCU enabled.
Feb 13 19:50:24.737658 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 13 19:50:24.737665 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Feb 13 19:50:24.737670 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Feb 13 19:50:24.737676 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Feb 13 19:50:24.737682 kernel: Console: colour VGA+ 80x25
Feb 13 19:50:24.737688 kernel: printk: console [tty0] enabled
Feb 13 19:50:24.737694 kernel: printk: console [ttyS0] enabled
Feb 13 19:50:24.737700 kernel: ACPI: Core revision 20230628
Feb 13 19:50:24.737705 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Feb 13 19:50:24.737711 kernel: APIC: Switch to symmetric I/O mode setup
Feb 13 19:50:24.737717 kernel: x2apic enabled
Feb 13 19:50:24.737724 kernel: APIC: Switched APIC routing to: physical x2apic
Feb 13 19:50:24.737730 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Feb 13 19:50:24.737736 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Feb 13 19:50:24.737742 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Feb 13 19:50:24.737748 kernel: Disabled fast string operations
Feb 13 19:50:24.737754 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Feb 13 19:50:24.737759 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Feb 13 19:50:24.737765 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 13 19:50:24.737771 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Feb 13 19:50:24.737778 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Feb 13 19:50:24.737784 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Feb 13 19:50:24.737790 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 13 19:50:24.737796 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Feb 13 19:50:24.737802 kernel: RETBleed: Mitigation: Enhanced IBRS
Feb 13 19:50:24.737808 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 13 19:50:24.737814 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 13 19:50:24.737820 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Feb 13 19:50:24.737827 kernel: SRBDS: Unknown: Dependent on hypervisor status
Feb 13 19:50:24.737833 kernel: GDS: Unknown: Dependent on hypervisor status
Feb 13 19:50:24.737838 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 13 19:50:24.737844 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 13 19:50:24.737850 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 13 19:50:24.737856 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Feb 13 19:50:24.737862 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Feb 13 19:50:24.737868 kernel: Freeing SMP alternatives memory: 32K Feb 13 19:50:24.737873 kernel: pid_max: default: 131072 minimum: 1024 Feb 13 19:50:24.737880 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Feb 13 19:50:24.737886 kernel: landlock: Up and running. Feb 13 19:50:24.737892 kernel: SELinux: Initializing. Feb 13 19:50:24.737898 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.737904 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.737910 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Feb 13 19:50:24.737916 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737922 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737928 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737935 kernel: Performance Events: Skylake events, core PMU driver. Feb 13 19:50:24.737941 kernel: core: CPUID marked event: 'cpu cycles' unavailable Feb 13 19:50:24.737947 kernel: core: CPUID marked event: 'instructions' unavailable Feb 13 19:50:24.737952 kernel: core: CPUID marked event: 'bus cycles' unavailable Feb 13 19:50:24.737958 kernel: core: CPUID marked event: 'cache references' unavailable Feb 13 19:50:24.737964 kernel: core: CPUID marked event: 'cache misses' unavailable Feb 13 19:50:24.737969 kernel: core: CPUID marked event: 'branch instructions' unavailable Feb 13 19:50:24.737975 kernel: core: CPUID marked event: 'branch misses' unavailable Feb 13 19:50:24.737982 kernel: ... version: 1 Feb 13 19:50:24.737988 kernel: ... bit width: 48 Feb 13 19:50:24.737994 kernel: ... generic registers: 4 Feb 13 19:50:24.738000 kernel: ... value mask: 0000ffffffffffff Feb 13 19:50:24.738005 kernel: ... 
max period: 000000007fffffff Feb 13 19:50:24.738011 kernel: ... fixed-purpose events: 0 Feb 13 19:50:24.738017 kernel: ... event mask: 000000000000000f Feb 13 19:50:24.738023 kernel: signal: max sigframe size: 1776 Feb 13 19:50:24.738029 kernel: rcu: Hierarchical SRCU implementation. Feb 13 19:50:24.738036 kernel: rcu: Max phase no-delay instances is 400. Feb 13 19:50:24.738043 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 19:50:24.738049 kernel: smp: Bringing up secondary CPUs ... Feb 13 19:50:24.738055 kernel: smpboot: x86: Booting SMP configuration: Feb 13 19:50:24.738061 kernel: .... node #0, CPUs: #1 Feb 13 19:50:24.738066 kernel: Disabled fast string operations Feb 13 19:50:24.738072 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Feb 13 19:50:24.738078 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 19:50:24.738084 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 19:50:24.738089 kernel: smpboot: Max logical packages: 128 Feb 13 19:50:24.738095 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Feb 13 19:50:24.738102 kernel: devtmpfs: initialized Feb 13 19:50:24.738108 kernel: x86/mm: Memory block size: 128MB Feb 13 19:50:24.738114 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Feb 13 19:50:24.738120 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 19:50:24.738126 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Feb 13 19:50:24.738131 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 19:50:24.738137 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 19:50:24.738143 kernel: audit: initializing netlink subsys (disabled) Feb 13 19:50:24.738151 kernel: audit: type=2000 audit(1739476223.071:1): state=initialized audit_enabled=0 res=1 Feb 13 19:50:24.738156 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 19:50:24.738162 
kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 19:50:24.738168 kernel: cpuidle: using governor menu Feb 13 19:50:24.738174 kernel: Simple Boot Flag at 0x36 set to 0x80 Feb 13 19:50:24.738180 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 19:50:24.738185 kernel: dca service started, version 1.12.1 Feb 13 19:50:24.738191 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Feb 13 19:50:24.738197 kernel: PCI: Using configuration type 1 for base access Feb 13 19:50:24.738203 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Feb 13 19:50:24.738210 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 19:50:24.738216 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 19:50:24.738222 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 19:50:24.738228 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 19:50:24.738234 kernel: ACPI: Added _OSI(Module Device) Feb 13 19:50:24.738239 kernel: ACPI: Added _OSI(Processor Device) Feb 13 19:50:24.738245 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 19:50:24.738251 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 19:50:24.738257 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 19:50:24.738264 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Feb 13 19:50:24.738270 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 19:50:24.738276 kernel: ACPI: Interpreter enabled Feb 13 19:50:24.738282 kernel: ACPI: PM: (supports S0 S1 S5) Feb 13 19:50:24.738287 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 19:50:24.738293 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 19:50:24.738299 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 19:50:24.738305 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Feb 13 19:50:24.738312 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Feb 13 19:50:24.738393 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 19:50:24.738463 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Feb 13 19:50:24.738513 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Feb 13 19:50:24.738521 kernel: PCI host bridge to bus 0000:00 Feb 13 19:50:24.738571 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 19:50:24.738616 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Feb 13 19:50:24.738663 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 19:50:24.738719 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 19:50:24.738764 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Feb 13 19:50:24.738807 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Feb 13 19:50:24.738866 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Feb 13 19:50:24.738921 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Feb 13 19:50:24.738980 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Feb 13 19:50:24.739034 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Feb 13 19:50:24.739084 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Feb 13 19:50:24.739133 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Feb 13 19:50:24.739182 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Feb 13 19:50:24.739231 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Feb 13 19:50:24.739279 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Feb 13 19:50:24.739378 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Feb 13 19:50:24.739452 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Feb 13 19:50:24.739504 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Feb 13 19:50:24.739557 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Feb 13 19:50:24.739606 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Feb 13 19:50:24.739655 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Feb 13 19:50:24.739715 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Feb 13 19:50:24.739764 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Feb 13 19:50:24.739813 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Feb 13 19:50:24.739861 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Feb 13 19:50:24.739910 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Feb 13 19:50:24.739959 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 19:50:24.740012 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Feb 13 19:50:24.740069 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740120 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740193 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740293 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740378 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740812 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740876 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740929 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740983 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.741035 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.741090 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.741141 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Feb 13 19:50:24.742481 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742543 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742601 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742654 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742710 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742761 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742819 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742873 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744439 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744505 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744563 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744616 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744674 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744730 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744784 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744834 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744890 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744942 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745000 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745051 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745105 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745155 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745208 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745259 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745315 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745365 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.747443 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.747508 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.747567 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748477 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748539 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748596 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748655 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748713 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748769 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748820 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748874 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748928 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748982 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749032 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749086 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749136 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749190 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749244 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749297 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749347 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749415 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Feb 13 
19:50:24.749474 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749527 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749580 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749636 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749687 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749741 kernel: pci_bus 0000:01: extended config space not accessible Feb 13 19:50:24.749793 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 19:50:24.749845 kernel: pci_bus 0000:02: extended config space not accessible Feb 13 19:50:24.749855 kernel: acpiphp: Slot [32] registered Feb 13 19:50:24.749863 kernel: acpiphp: Slot [33] registered Feb 13 19:50:24.749869 kernel: acpiphp: Slot [34] registered Feb 13 19:50:24.749875 kernel: acpiphp: Slot [35] registered Feb 13 19:50:24.749881 kernel: acpiphp: Slot [36] registered Feb 13 19:50:24.749887 kernel: acpiphp: Slot [37] registered Feb 13 19:50:24.749893 kernel: acpiphp: Slot [38] registered Feb 13 19:50:24.749899 kernel: acpiphp: Slot [39] registered Feb 13 19:50:24.749905 kernel: acpiphp: Slot [40] registered Feb 13 19:50:24.749911 kernel: acpiphp: Slot [41] registered Feb 13 19:50:24.749918 kernel: acpiphp: Slot [42] registered Feb 13 19:50:24.749924 kernel: acpiphp: Slot [43] registered Feb 13 19:50:24.749930 kernel: acpiphp: Slot [44] registered Feb 13 19:50:24.749936 kernel: acpiphp: Slot [45] registered Feb 13 19:50:24.749941 kernel: acpiphp: Slot [46] registered Feb 13 19:50:24.749947 kernel: acpiphp: Slot [47] registered Feb 13 19:50:24.749953 kernel: acpiphp: Slot [48] registered Feb 13 19:50:24.749959 kernel: acpiphp: Slot [49] registered Feb 13 19:50:24.749964 kernel: acpiphp: Slot [50] registered Feb 13 19:50:24.749970 kernel: acpiphp: Slot [51] registered Feb 13 19:50:24.749977 kernel: acpiphp: Slot [52] registered Feb 13 19:50:24.749983 kernel: acpiphp: Slot [53] registered 
Feb 13 19:50:24.749989 kernel: acpiphp: Slot [54] registered Feb 13 19:50:24.749995 kernel: acpiphp: Slot [55] registered Feb 13 19:50:24.750001 kernel: acpiphp: Slot [56] registered Feb 13 19:50:24.750007 kernel: acpiphp: Slot [57] registered Feb 13 19:50:24.750013 kernel: acpiphp: Slot [58] registered Feb 13 19:50:24.750018 kernel: acpiphp: Slot [59] registered Feb 13 19:50:24.750025 kernel: acpiphp: Slot [60] registered Feb 13 19:50:24.750032 kernel: acpiphp: Slot [61] registered Feb 13 19:50:24.750038 kernel: acpiphp: Slot [62] registered Feb 13 19:50:24.750044 kernel: acpiphp: Slot [63] registered Feb 13 19:50:24.750094 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Feb 13 19:50:24.750144 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 19:50:24.750193 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 19:50:24.750241 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:50:24.750290 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Feb 13 19:50:24.750341 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Feb 13 19:50:24.750390 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Feb 13 19:50:24.755478 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Feb 13 19:50:24.755541 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Feb 13 19:50:24.755602 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Feb 13 19:50:24.755656 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Feb 13 19:50:24.755708 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Feb 13 19:50:24.755763 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 19:50:24.755814 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 
19:50:24.755866 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Feb 13 19:50:24.755919 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 19:50:24.755969 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 19:50:24.756019 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 19:50:24.756071 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 19:50:24.756121 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 19:50:24.756174 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 19:50:24.756223 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 19:50:24.756275 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 19:50:24.756325 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 19:50:24.756374 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 19:50:24.756437 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:50:24.756490 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 19:50:24.756544 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 19:50:24.756594 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:50:24.756645 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 19:50:24.756694 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 19:50:24.756744 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:50:24.756798 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 19:50:24.756848 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 19:50:24.756897 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:50:24.756949 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 19:50:24.756999 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Feb 13 19:50:24.757048 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:50:24.757100 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 19:50:24.757150 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 19:50:24.757203 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:50:24.757261 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Feb 13 19:50:24.757313 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Feb 13 19:50:24.757364 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Feb 13 19:50:24.759476 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Feb 13 19:50:24.759554 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Feb 13 19:50:24.759610 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 19:50:24.759668 kernel: pci 0000:0b:00.0: supports D1 D2 Feb 13 19:50:24.759720 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 19:50:24.759772 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 19:50:24.759825 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 19:50:24.759877 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 19:50:24.759927 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 19:50:24.759979 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 19:50:24.760030 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 19:50:24.760083 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 19:50:24.760133 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:50:24.760185 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 19:50:24.760236 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 19:50:24.760286 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 19:50:24.760335 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:50:24.760387 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 19:50:24.760503 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 19:50:24.760555 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:50:24.760607 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 19:50:24.760657 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 19:50:24.760709 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:50:24.760761 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 19:50:24.760810 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 19:50:24.760859 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:50:24.760914 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 19:50:24.760963 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 19:50:24.761012 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:50:24.761064 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 19:50:24.761113 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 19:50:24.761162 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:50:24.761213 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 19:50:24.761263 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 19:50:24.761312 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 19:50:24.761364 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:50:24.761435 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 19:50:24.761487 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 19:50:24.761537 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 19:50:24.761587 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:50:24.761639 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 19:50:24.761689 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 19:50:24.761747 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 19:50:24.761797 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:50:24.761848 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 19:50:24.761898 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 19:50:24.761948 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:50:24.761998 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 19:50:24.762048 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 19:50:24.762098 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:50:24.762152 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 19:50:24.762202 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 19:50:24.762252 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 19:50:24.762303 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 19:50:24.762353 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 19:50:24.762409 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:50:24.762479 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 19:50:24.762529 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 19:50:24.762581 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:50:24.762633 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 19:50:24.762683 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 19:50:24.762732 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 19:50:24.762782 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:50:24.762834 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 19:50:24.762883 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 19:50:24.762935 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 19:50:24.762984 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:50:24.763036 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 19:50:24.763086 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 19:50:24.763135 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:50:24.763187 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 19:50:24.763236 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 19:50:24.763285 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:50:24.763339 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 
19:50:24.763388 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Feb 13 19:50:24.763478 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Feb 13 19:50:24.763529 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Feb 13 19:50:24.763580 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Feb 13 19:50:24.763629 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Feb 13 19:50:24.763679 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Feb 13 19:50:24.763728 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Feb 13 19:50:24.763780 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Feb 13 19:50:24.763831 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Feb 13 19:50:24.763880 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Feb 13 19:50:24.763929 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Feb 13 19:50:24.763938 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Feb 13 19:50:24.763944 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Feb 13 19:50:24.763950 kernel: ACPI: PCI: Interrupt link LNKB disabled
Feb 13 19:50:24.763956 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 13 19:50:24.763962 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Feb 13 19:50:24.763971 kernel: iommu: Default domain type: Translated
Feb 13 19:50:24.763977 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Feb 13 19:50:24.763983 kernel: PCI: Using ACPI for IRQ routing
Feb 13 19:50:24.763989 kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 13 19:50:24.763995 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Feb 13 19:50:24.764001 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Feb 13 19:50:24.764051 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Feb 13 19:50:24.764100 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Feb 13 19:50:24.764149 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 13 19:50:24.764160 kernel: vgaarb: loaded
Feb 13 19:50:24.764166 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Feb 13 19:50:24.764172 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Feb 13 19:50:24.764178 kernel: clocksource: Switched to clocksource tsc-early
Feb 13 19:50:24.764184 kernel: VFS: Disk quotas dquot_6.6.0
Feb 13 19:50:24.764191 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 13 19:50:24.764197 kernel: pnp: PnP ACPI init
Feb 13 19:50:24.764250 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Feb 13 19:50:24.764299 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Feb 13 19:50:24.764344 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Feb 13 19:50:24.764393 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Feb 13 19:50:24.764475 kernel: pnp 00:06: [dma 2]
Feb 13 19:50:24.764525 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Feb 13 19:50:24.764570 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Feb 13 19:50:24.764617 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Feb 13 19:50:24.764626 kernel: pnp: PnP ACPI: found 8 devices
Feb 13 19:50:24.764632 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 13 19:50:24.764638 kernel: NET: Registered PF_INET protocol family
Feb 13 19:50:24.764644 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 13 19:50:24.764650 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Feb 13 19:50:24.764656 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 13 19:50:24.764662 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Feb 13 19:50:24.764668 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Feb 13 19:50:24.764676 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Feb 13 19:50:24.764682 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 19:50:24.764688 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Feb 13 19:50:24.764697 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 13 19:50:24.764703 kernel: NET: Registered PF_XDP protocol family
Feb 13 19:50:24.764755 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Feb 13 19:50:24.764806 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Feb 13 19:50:24.764857 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Feb 13 19:50:24.764910 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Feb 13 19:50:24.764961 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Feb 13 19:50:24.765011 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Feb 13 19:50:24.765061 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Feb 13 19:50:24.765111 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Feb 13 19:50:24.765162 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Feb 13 19:50:24.765214 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Feb 13 19:50:24.765264 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Feb 13 19:50:24.765314 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Feb 13 19:50:24.765364 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Feb 13 19:50:24.765420 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Feb 13 19:50:24.765473 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Feb 13 19:50:24.765523 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Feb 13 19:50:24.765573 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Feb 13 19:50:24.765622 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Feb 13 19:50:24.765671 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Feb 13 19:50:24.765721 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Feb 13 19:50:24.765789 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Feb 13 19:50:24.765839 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Feb 13 19:50:24.765888 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Feb 13 19:50:24.765938 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Feb 13 19:50:24.765987 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Feb 13 19:50:24.766036 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766085 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766137 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766187 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766236 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766285 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766334 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766384 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766460 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766510 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766562 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766611 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766660 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766709 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766758 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766806 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766856 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.766906 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.766958 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767007 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767057 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767105 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767154 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767203 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767252 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767300 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767352 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767407 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767458 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767508 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767557 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767606 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767663 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767718 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767772 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767822 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767872 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.767922 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.767971 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768021 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768070 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768119 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768171 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768220 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768270 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768320 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768369 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768461 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768511 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768560 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768609 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768657 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768714 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768778 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768825 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768881 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.768929 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.768977 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769025 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769074 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769122 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769174 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769223 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769272 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769320 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769369 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769440 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769490 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769539 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769587 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769639 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769688 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769737 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769785 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769833 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769881 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.769929 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.769997 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.770046 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.770095 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.770147 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.770195 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.770244 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Feb 13 19:50:24.770293 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Feb 13 19:50:24.770344 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Feb 13 19:50:24.770394 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Feb 13 19:50:24.770490 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Feb 13 19:50:24.770539 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Feb 13 19:50:24.770587 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Feb 13 19:50:24.770643 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Feb 13 19:50:24.770727 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Feb 13 19:50:24.770791 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Feb 13 19:50:24.770839 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Feb 13 19:50:24.770887 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Feb 13 19:50:24.770937 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Feb 13 19:50:24.770985 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Feb 13 19:50:24.771034 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Feb 13 19:50:24.771085 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Feb 13 19:50:24.771135 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Feb 13 19:50:24.771184 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Feb 13 19:50:24.771233 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Feb 13 19:50:24.771282 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Feb 13 19:50:24.771330 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Feb 13 19:50:24.771378 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Feb 13 19:50:24.771531 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Feb 13 19:50:24.771582 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Feb 13 19:50:24.771630 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Feb 13 19:50:24.771682 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Feb 13 19:50:24.771737 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Feb 13 19:50:24.771786 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Feb 13 19:50:24.771835 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Feb 13 19:50:24.771882 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Feb 13 19:50:24.771933 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Feb 13 19:50:24.771984 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Feb 13 19:50:24.772033 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Feb 13 19:50:24.772082 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Feb 13 19:50:24.772130 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Feb 13 19:50:24.772182 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref]
Feb 13 19:50:24.772232 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Feb 13 19:50:24.772281 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Feb 13 19:50:24.772331 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Feb 13 19:50:24.772380 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Feb 13 19:50:24.772444 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Feb 13 19:50:24.772495 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Feb 13 19:50:24.772544 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Feb 13 19:50:24.772593 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Feb 13 19:50:24.772642 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Feb 13 19:50:24.772691 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Feb 13 19:50:24.772769 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Feb 13 19:50:24.772819 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Feb 13 19:50:24.772870 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Feb 13 19:50:24.772923 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Feb 13 19:50:24.772973 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Feb 13 19:50:24.773022 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Feb 13 19:50:24.773072 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Feb 13 19:50:24.773122 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Feb 13 19:50:24.773172 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Feb 13 19:50:24.773222 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Feb 13 19:50:24.773272 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Feb 13 19:50:24.773322 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Feb 13 19:50:24.773371 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Feb 13 19:50:24.773431 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Feb 13 19:50:24.773483 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Feb 13 19:50:24.773533 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Feb 13 19:50:24.773585 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Feb 13 19:50:24.773636 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Feb 13 19:50:24.773687 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Feb 13 19:50:24.773738 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Feb 13 19:50:24.773800 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Feb 13 19:50:24.773874 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Feb 13 19:50:24.773938 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Feb 13 19:50:24.774004 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Feb 13 19:50:24.774066 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Feb 13 19:50:24.774136 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Feb 13 19:50:24.774197 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Feb 13 19:50:24.774259 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Feb 13 19:50:24.774311 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Feb 13 19:50:24.774362 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Feb 13 19:50:24.774437 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Feb 13 19:50:24.774498 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Feb 13 19:50:24.774553 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Feb 13 19:50:24.774603 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Feb 13 19:50:24.774653 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Feb 13 19:50:24.774707 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Feb 13 19:50:24.774758 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Feb 13 19:50:24.774808 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Feb 13 19:50:24.774859 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Feb 13 19:50:24.774908 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Feb 13 19:50:24.774958 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Feb 13 19:50:24.775011 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Feb 13 19:50:24.775060 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Feb 13 19:50:24.775109 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Feb 13 19:50:24.775160 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Feb 13 19:50:24.775220 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Feb 13 19:50:24.775290 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Feb 13 19:50:24.775356 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Feb 13 19:50:24.775481 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Feb 13 19:50:24.775549 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Feb 13 19:50:24.775610 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Feb 13 19:50:24.775683 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Feb 13 19:50:24.775742 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Feb 13 19:50:24.775792 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Feb 13 19:50:24.775842 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Feb 13 19:50:24.775892 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Feb 13 19:50:24.775941 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Feb 13 19:50:24.775990 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Feb 13 19:50:24.776040 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Feb 13 19:50:24.776089 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Feb 13 19:50:24.776142 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Feb 13 19:50:24.776192 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Feb 13 19:50:24.776242 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Feb 13 19:50:24.776291 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Feb 13 19:50:24.776341 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Feb 13 19:50:24.776390 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Feb 13 19:50:24.776484 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Feb 13 19:50:24.776536 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Feb 13 19:50:24.776586 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Feb 13 19:50:24.776636 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Feb 13 19:50:24.776688 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window]
Feb 13 19:50:24.776733 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window]
Feb 13 19:50:24.776785 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window]
Feb 13 19:50:24.776848 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window]
Feb 13 19:50:24.776907 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window]
Feb 13 19:50:24.776968 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff]
Feb 13 19:50:24.777033 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff]
Feb 13 19:50:24.777100 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref]
Feb 13 19:50:24.777156 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window]
Feb 13 19:50:24.777215 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window]
Feb 13 19:50:24.777266 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window]
Feb 13 19:50:24.777318 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window]
Feb 13 19:50:24.777364 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window]
Feb 13 19:50:24.778453 kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff]
Feb 13 19:50:24.778518 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff]
Feb 13 19:50:24.778577 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref]
Feb 13 19:50:24.778647 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff]
Feb 13 19:50:24.778719 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff]
Feb 13 19:50:24.778783 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref]
Feb 13 19:50:24.778835 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff]
Feb 13 19:50:24.778882 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff]
Feb 13 19:50:24.778941 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref]
Feb 13 19:50:24.779009 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff]
Feb 13 19:50:24.779060 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref]
Feb 13 19:50:24.779122 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff]
Feb 13 19:50:24.779186 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref]
Feb 13 19:50:24.779252 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff]
Feb 13 19:50:24.779303 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref]
Feb 13 19:50:24.779366 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff]
Feb 13 19:50:24.780454 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref]
Feb 13 19:50:24.780526 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff]
Feb 13 19:50:24.780603 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref]
Feb 13 19:50:24.780663 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff]
Feb 13 19:50:24.780718 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff]
Feb 13 19:50:24.780777 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref]
Feb 13 19:50:24.780834 kernel: pci_bus 0000:0c: resource 0 [io 0x9000-0x9fff]
Feb 13 19:50:24.780887 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff]
Feb 13 19:50:24.780941 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref]
Feb 13 19:50:24.780998 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff]
Feb 13 19:50:24.781060 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff]
Feb 13 19:50:24.781131 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref]
Feb 13 19:50:24.781187 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff]
Feb 13 19:50:24.781241 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref]
Feb 13 19:50:24.781296 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff]
Feb 13 19:50:24.781365 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref]
Feb 13 19:50:24.781929 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff]
Feb 13 19:50:24.781988 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref]
Feb 13 19:50:24.782062 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff]
Feb 13 19:50:24.782116 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref]
Feb 13 19:50:24.782190 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff]
Feb 13 19:50:24.782256 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref]
Feb 13 19:50:24.782325 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff]
Feb 13 19:50:24.782382 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff]
Feb 13 19:50:24.783294 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref]
Feb 13 19:50:24.783367 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff]
Feb 13 19:50:24.783485 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff]
Feb 13 19:50:24.783592 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref]
Feb 13 19:50:24.783651 kernel: pci_bus 0000:15: resource 0 [io 0xe000-0xefff]
Feb 13 19:50:24.783701 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff]
Feb 13 19:50:24.783753 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref]
Feb 13 19:50:24.783809 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff]
Feb 13 19:50:24.783857 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref]
Feb 13 19:50:24.783910 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff]
Feb 13 19:50:24.783958 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref]
Feb 13 19:50:24.784011 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff]
Feb 13 19:50:24.784060 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref]
Feb 13 19:50:24.784115 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff]
Feb 13 19:50:24.784162 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref]
Feb 13 19:50:24.784213 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff]
Feb 13 19:50:24.784260 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref]
Feb 13 19:50:24.784314 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff]
Feb 13 19:50:24.784364 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff]
Feb 13 19:50:24.784424 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref]
Feb 13 19:50:24.784477 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff]
Feb 13 19:50:24.784536 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff]
Feb 13 19:50:24.784585 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref]
Feb 13 19:50:24.784637 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff]
Feb 13 19:50:24.784684 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref]
Feb 13 19:50:24.784740 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff]
Feb 13 19:50:24.784787 kernel: pci_bus 0000:1e: resource 2 [mem 0xe6d00000-0xe6dfffff 64bit pref]
Feb 13 19:50:24.784838 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff]
Feb 13 19:50:24.784885 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref]
Feb 13 19:50:24.784939 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff]
Feb 13 19:50:24.784986 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref]
Feb 13 19:50:24.785047 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff]
Feb 13 19:50:24.785096 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref]
Feb 13 19:50:24.785158 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff]
Feb 13 19:50:24.785208 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref]
Feb 13 19:50:24.785268 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 13 19:50:24.785279 kernel: PCI: CLS 32 bytes, default 64
Feb 13 19:50:24.785286 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer
Feb 13 19:50:24.785296 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Feb 13 19:50:24.785302 kernel: clocksource: Switched to clocksource tsc
Feb 13 19:50:24.785309 kernel: Initialise system trusted keyrings
Feb 13 19:50:24.785316 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0
Feb 13 19:50:24.785322 kernel: Key type asymmetric registered
Feb 13 19:50:24.785328 kernel: Asymmetric key parser 'x509' registered
Feb 13 19:50:24.785336 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
Feb 13 19:50:24.785343 kernel: io scheduler mq-deadline registered
Feb 13 19:50:24.785349 kernel: io scheduler kyber registered
Feb 13 19:50:24.785357 kernel: io scheduler bfq registered
Feb 13 19:50:24.786974 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24
Feb 13 19:50:24.787042 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787100 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25
Feb 13 19:50:24.787153 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787208 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26
Feb 13 19:50:24.787284 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787352 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27
Feb 13 19:50:24.787424 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787492 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28
Feb 13 19:50:24.787555 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787610 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29
Feb 13 19:50:24.787690 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787773 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30
Feb 13 19:50:24.787838 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.787902 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31
Feb 13 19:50:24.787957 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788010 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32
Feb 13 19:50:24.788070 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788145 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33
Feb 13 19:50:24.788208 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788266 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34
Feb 13 19:50:24.788321 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788385 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35
Feb 13 19:50:24.788479 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788542 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36
Feb 13 19:50:24.788595 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788648 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37
Feb 13 19:50:24.788714 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788792 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38
Feb 13 19:50:24.788849 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.788907 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39
Feb 13 19:50:24.788961 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789019 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40
Feb 13 19:50:24.789082 kernel: pcieport 0000:00:17.0: pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789161 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41
Feb 13 19:50:24.789216 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789273 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42
Feb 13 19:50:24.789333 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789573 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43
Feb 13 19:50:24.789632 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789703 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44
Feb 13 19:50:24.789762 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789832 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45
Feb 13 19:50:24.789886 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.789940 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46
Feb 13 19:50:24.790005 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790070 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47
Feb 13 19:50:24.790131 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790185 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48
Feb 13 19:50:24.790253 kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790335 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49
Feb 13 19:50:24.790391 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790496 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50
Feb 13 19:50:24.790575 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790630 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51
Feb 13 19:50:24.790686 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790758 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52
Feb 13 19:50:24.790824 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.790877 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53
Feb 13 19:50:24.790936 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.791005 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54
Feb 13 19:50:24.791069 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.791127 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55
Feb 13 19:50:24.791186 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+
Feb 13 19:50:24.791198 kernel: ioatdma: Intel(R) QuickData Technology Driver
5.00 Feb 13 19:50:24.791206 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 19:50:24.791212 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 19:50:24.791219 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 19:50:24.791225 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 19:50:24.791232 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 19:50:24.791285 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 19:50:24.791342 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T19:50:24 UTC (1739476224) Feb 13 19:50:24.791772 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 19:50:24.791787 kernel: intel_pstate: CPU model not supported Feb 13 19:50:24.791799 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 19:50:24.791811 kernel: NET: Registered PF_INET6 protocol family Feb 13 19:50:24.791823 kernel: Segment Routing with IPv6 Feb 13 19:50:24.791833 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 19:50:24.791844 kernel: NET: Registered PF_PACKET protocol family Feb 13 19:50:24.791851 kernel: Key type dns_resolver registered Feb 13 19:50:24.791862 kernel: IPI shorthand broadcast: enabled Feb 13 19:50:24.791870 kernel: sched_clock: Marking stable (936180487, 233095616)->(1232030088, -62753985) Feb 13 19:50:24.791886 kernel: registered taskstats version 1 Feb 13 19:50:24.791897 kernel: Loading compiled-in X.509 certificates Feb 13 19:50:24.791908 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: b3acedbed401b3cd9632ee9302ddcce254d8924d' Feb 13 19:50:24.791919 kernel: Key type .fscrypt registered Feb 13 19:50:24.791928 kernel: Key type fscrypt-provisioning registered Feb 13 19:50:24.791934 kernel: ima: No TPM chip found, activating TPM-bypass! 
Feb 13 19:50:24.791940 kernel: ima: Allocated hash algorithm: sha1
Feb 13 19:50:24.791951 kernel: ima: No architecture policies found
Feb 13 19:50:24.791957 kernel: clk: Disabling unused clocks
Feb 13 19:50:24.791964 kernel: Freeing unused kernel image (initmem) memory: 43320K
Feb 13 19:50:24.791972 kernel: Write protecting the kernel read-only data: 38912k
Feb 13 19:50:24.791979 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K
Feb 13 19:50:24.791985 kernel: Run /init as init process
Feb 13 19:50:24.791991 kernel: with arguments:
Feb 13 19:50:24.792000 kernel: /init
Feb 13 19:50:24.792007 kernel: with environment:
Feb 13 19:50:24.792016 kernel: HOME=/
Feb 13 19:50:24.792022 kernel: TERM=linux
Feb 13 19:50:24.792029 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Feb 13 19:50:24.792036 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 19:50:24.792045 systemd[1]: Detected virtualization vmware.
Feb 13 19:50:24.792053 systemd[1]: Detected architecture x86-64.
Feb 13 19:50:24.792061 systemd[1]: Running in initrd.
Feb 13 19:50:24.792070 systemd[1]: No hostname configured, using default hostname.
Feb 13 19:50:24.792079 systemd[1]: Hostname set to .
Feb 13 19:50:24.792086 systemd[1]: Initializing machine ID from random generator.
Feb 13 19:50:24.792093 systemd[1]: Queued start job for default target initrd.target.
Feb 13 19:50:24.792100 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 19:50:24.792109 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 19:50:24.792117 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Feb 13 19:50:24.792123 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 19:50:24.792132 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Feb 13 19:50:24.792143 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Feb 13 19:50:24.792151 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Feb 13 19:50:24.792160 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Feb 13 19:50:24.792169 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 19:50:24.792177 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 19:50:24.792183 systemd[1]: Reached target paths.target - Path Units.
Feb 13 19:50:24.792191 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 19:50:24.792204 systemd[1]: Reached target swap.target - Swaps.
Feb 13 19:50:24.792213 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 19:50:24.792219 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 19:50:24.792226 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 19:50:24.792235 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Feb 13 19:50:24.792242 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Feb 13 19:50:24.792249 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 19:50:24.792255 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 19:50:24.792262 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 19:50:24.792271 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 19:50:24.792279 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Feb 13 19:50:24.792286 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 19:50:24.792294 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Feb 13 19:50:24.792301 systemd[1]: Starting systemd-fsck-usr.service...
Feb 13 19:50:24.792307 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 19:50:24.792314 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 19:50:24.792321 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:24.792329 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Feb 13 19:50:24.792354 systemd-journald[215]: Collecting audit messages is disabled.
Feb 13 19:50:24.792378 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 19:50:24.792389 systemd[1]: Finished systemd-fsck-usr.service.
Feb 13 19:50:24.792513 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Feb 13 19:50:24.792523 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 13 19:50:24.792531 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Feb 13 19:50:24.792537 kernel: Bridge firewalling registered
Feb 13 19:50:24.792549 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 19:50:24.792556 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:24.792563 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:50:24.792570 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 19:50:24.792576 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 19:50:24.792585 systemd-journald[215]: Journal started
Feb 13 19:50:24.792604 systemd-journald[215]: Runtime Journal (/run/log/journal/5b60a719a5654806b06355c06edc54b0) is 4.8M, max 38.6M, 33.8M free.
Feb 13 19:50:24.740384 systemd-modules-load[216]: Inserted module 'overlay'
Feb 13 19:50:24.794513 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 19:50:24.759534 systemd-modules-load[216]: Inserted module 'br_netfilter'
Feb 13 19:50:24.793870 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 19:50:24.797540 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 19:50:24.798474 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:24.798925 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 19:50:24.802558 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Feb 13 19:50:24.804032 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 19:50:24.806662 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 19:50:24.813104 dracut-cmdline[249]: dracut-dracut-053
Feb 13 19:50:24.814516 dracut-cmdline[249]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=015d1d9e5e601f6a4e226c935072d3d0819e7eb2da20e68715973498f21aa3fe
Feb 13 19:50:24.828255 systemd-resolved[251]: Positive Trust Anchors:
Feb 13 19:50:24.828267 systemd-resolved[251]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 19:50:24.828290 systemd-resolved[251]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 19:50:24.830104 systemd-resolved[251]: Defaulting to hostname 'linux'.
Feb 13 19:50:24.831042 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 19:50:24.831211 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 19:50:24.865413 kernel: SCSI subsystem initialized
Feb 13 19:50:24.871430 kernel: Loading iSCSI transport class v2.0-870.
Feb 13 19:50:24.878414 kernel: iscsi: registered transport (tcp)
Feb 13 19:50:24.891715 kernel: iscsi: registered transport (qla4xxx)
Feb 13 19:50:24.891758 kernel: QLogic iSCSI HBA Driver
Feb 13 19:50:24.912940 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Feb 13 19:50:24.917505 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Feb 13 19:50:24.934245 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 13 19:50:24.934301 kernel: device-mapper: uevent: version 1.0.3
Feb 13 19:50:24.934312 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Feb 13 19:50:24.967419 kernel: raid6: avx2x4 gen() 46415 MB/s
Feb 13 19:50:24.982413 kernel: raid6: avx2x2 gen() 53081 MB/s
Feb 13 19:50:24.999626 kernel: raid6: avx2x1 gen() 44507 MB/s
Feb 13 19:50:24.999671 kernel: raid6: using algorithm avx2x2 gen() 53081 MB/s
Feb 13 19:50:25.017659 kernel: raid6: .... xor() 32197 MB/s, rmw enabled
Feb 13 19:50:25.017703 kernel: raid6: using avx2x2 recovery algorithm
Feb 13 19:50:25.031429 kernel: xor: automatically using best checksumming function avx
Feb 13 19:50:25.121420 kernel: Btrfs loaded, zoned=no, fsverity=no
Feb 13 19:50:25.127180 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 19:50:25.133534 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 19:50:25.141467 systemd-udevd[434]: Using default interface naming scheme 'v255'.
Feb 13 19:50:25.144346 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 19:50:25.150539 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Feb 13 19:50:25.157925 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation
Feb 13 19:50:25.175167 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 19:50:25.179497 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 19:50:25.249964 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 19:50:25.254891 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Feb 13 19:50:25.265546 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Feb 13 19:50:25.265920 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 19:50:25.266215 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 19:50:25.266570 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 19:50:25.270483 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Feb 13 19:50:25.278743 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 19:50:25.315426 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Feb 13 19:50:25.322430 kernel: vmw_pvscsi: using 64bit dma
Feb 13 19:50:25.324509 kernel: vmw_pvscsi: max_id: 16
Feb 13 19:50:25.324529 kernel: vmw_pvscsi: setting ring_pages to 8
Feb 13 19:50:25.332683 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Feb 13 19:50:25.332726 kernel: vmw_pvscsi: enabling reqCallThreshold
Feb 13 19:50:25.332741 kernel: vmw_pvscsi: driver-based request coalescing enabled
Feb 13 19:50:25.332753 kernel: vmw_pvscsi: using MSI-X
Feb 13 19:50:25.334568 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Feb 13 19:50:25.336132 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Feb 13 19:50:25.352005 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Feb 13 19:50:25.352090 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Feb 13 19:50:25.352184 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Feb 13 19:50:25.352260 kernel: libata version 3.00 loaded.
Feb 13 19:50:25.353421 kernel: cryptd: max_cpu_qlen set to 1000
Feb 13 19:50:25.357413 kernel: ata_piix 0000:00:07.1: version 2.13
Feb 13 19:50:25.373663 kernel: scsi host1: ata_piix
Feb 13 19:50:25.373739 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Feb 13 19:50:25.373812 kernel: scsi host2: ata_piix
Feb 13 19:50:25.373880 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Feb 13 19:50:25.373889 kernel: AVX2 version of gcm_enc/dec engaged.
Feb 13 19:50:25.373897 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Feb 13 19:50:25.373904 kernel: AES CTR mode by8 optimization enabled
Feb 13 19:50:25.359434 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 19:50:25.361531 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:25.361701 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:50:25.361790 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 19:50:25.361857 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:25.361957 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:25.369288 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:25.382494 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:25.397652 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:50:25.409626 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:25.535483 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Feb 13 19:50:25.541416 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Feb 13 19:50:25.549571 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Feb 13 19:50:25.583752 kernel: sd 0:0:0:0: [sda] Write Protect is off
Feb 13 19:50:25.583856 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Feb 13 19:50:25.583940 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Feb 13 19:50:25.584020 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Feb 13 19:50:25.584100 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:25.584112 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Feb 13 19:50:25.597961 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Feb 13 19:50:25.610365 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 13 19:50:25.610380 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Feb 13 19:50:25.636418 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (485)
Feb 13 19:50:25.642414 kernel: BTRFS: device fsid c7adc9b8-df7f-4a5f-93bf-204def2767a9 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (494)
Feb 13 19:50:25.643111 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Feb 13 19:50:25.646636 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Feb 13 19:50:25.649876 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Feb 13 19:50:25.652928 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Feb 13 19:50:25.653062 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Feb 13 19:50:25.657491 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Feb 13 19:50:25.721424 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:26.731432 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:26.731471 disk-uuid[595]: The operation has completed successfully.
Feb 13 19:50:26.817689 systemd[1]: disk-uuid.service: Deactivated successfully.
Feb 13 19:50:26.818019 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Feb 13 19:50:26.824503 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Feb 13 19:50:26.826494 sh[612]: Success
Feb 13 19:50:26.835417 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Feb 13 19:50:26.913891 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Feb 13 19:50:26.924250 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Feb 13 19:50:26.924644 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Feb 13 19:50:26.940649 kernel: BTRFS info (device dm-0): first mount of filesystem c7adc9b8-df7f-4a5f-93bf-204def2767a9
Feb 13 19:50:26.940682 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:26.940691 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Feb 13 19:50:26.942749 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Feb 13 19:50:26.942765 kernel: BTRFS info (device dm-0): using free space tree
Feb 13 19:50:26.950417 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Feb 13 19:50:26.951865 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Feb 13 19:50:26.959508 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Feb 13 19:50:26.960701 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Feb 13 19:50:27.025302 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.025351 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:27.025369 kernel: BTRFS info (device sda6): using free space tree
Feb 13 19:50:27.029424 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 19:50:27.039882 systemd[1]: mnt-oem.mount: Deactivated successfully.
Feb 13 19:50:27.042449 kernel: BTRFS info (device sda6): last unmount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.045878 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Feb 13 19:50:27.049565 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Feb 13 19:50:27.073627 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Feb 13 19:50:27.079509 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Feb 13 19:50:27.135851 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 19:50:27.143598 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 19:50:27.144752 ignition[673]: Ignition 2.20.0
Feb 13 19:50:27.144900 ignition[673]: Stage: fetch-offline
Feb 13 19:50:27.144923 ignition[673]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.144928 ignition[673]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.144976 ignition[673]: parsed url from cmdline: ""
Feb 13 19:50:27.144978 ignition[673]: no config URL provided
Feb 13 19:50:27.144981 ignition[673]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 19:50:27.144985 ignition[673]: no config at "/usr/lib/ignition/user.ign"
Feb 13 19:50:27.145336 ignition[673]: config successfully fetched
Feb 13 19:50:27.145353 ignition[673]: parsing config with SHA512: 47d845dd4d4313284ffd0570bdb128208c674c684235bc5d5814183181f8b6aa8252e2217ff10d57a42511567f1f2015461a33f65b692b3b86ddb4c670b7f41f
Feb 13 19:50:27.148597 unknown[673]: fetched base config from "system"
Feb 13 19:50:27.148724 unknown[673]: fetched user config from "vmware"
Feb 13 19:50:27.149075 ignition[673]: fetch-offline: fetch-offline passed
Feb 13 19:50:27.149226 ignition[673]: Ignition finished successfully
Feb 13 19:50:27.150579 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 19:50:27.157223 systemd-networkd[805]: lo: Link UP
Feb 13 19:50:27.157229 systemd-networkd[805]: lo: Gained carrier
Feb 13 19:50:27.157973 systemd-networkd[805]: Enumeration completed
Feb 13 19:50:27.158231 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 19:50:27.158264 systemd-networkd[805]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Feb 13 19:50:27.158512 systemd[1]: Reached target network.target - Network.
Feb 13 19:50:27.162061 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Feb 13 19:50:27.162222 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Feb 13 19:50:27.158856 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Feb 13 19:50:27.161715 systemd-networkd[805]: ens192: Link UP
Feb 13 19:50:27.161718 systemd-networkd[805]: ens192: Gained carrier
Feb 13 19:50:27.164511 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Feb 13 19:50:27.173701 ignition[808]: Ignition 2.20.0
Feb 13 19:50:27.173709 ignition[808]: Stage: kargs
Feb 13 19:50:27.173819 ignition[808]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.173825 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.174447 ignition[808]: kargs: kargs passed
Feb 13 19:50:27.174476 ignition[808]: Ignition finished successfully
Feb 13 19:50:27.175832 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Feb 13 19:50:27.180574 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Feb 13 19:50:27.187978 ignition[815]: Ignition 2.20.0
Feb 13 19:50:27.187985 ignition[815]: Stage: disks
Feb 13 19:50:27.188513 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.188523 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.189471 ignition[815]: disks: disks passed
Feb 13 19:50:27.189510 ignition[815]: Ignition finished successfully
Feb 13 19:50:27.190033 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Feb 13 19:50:27.190438 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Feb 13 19:50:27.190577 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Feb 13 19:50:27.190783 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 19:50:27.190977 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 19:50:27.191151 systemd[1]: Reached target basic.target - Basic System.
Feb 13 19:50:27.195483 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Feb 13 19:50:27.206150 systemd-fsck[823]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Feb 13 19:50:27.207673 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Feb 13 19:50:27.212614 systemd[1]: Mounting sysroot.mount - /sysroot...
Feb 13 19:50:27.316995 systemd[1]: Mounted sysroot.mount - /sysroot.
Feb 13 19:50:27.317423 kernel: EXT4-fs (sda9): mounted filesystem 7d46b70d-4c30-46e6-9935-e1f7fb523560 r/w with ordered data mode. Quota mode: none.
Feb 13 19:50:27.317361 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Feb 13 19:50:27.321459 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 19:50:27.324357 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Feb 13 19:50:27.324752 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Feb 13 19:50:27.324787 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Feb 13 19:50:27.324807 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 19:50:27.329119 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Feb 13 19:50:27.330089 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Feb 13 19:50:27.378394 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (831)
Feb 13 19:50:27.378456 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.378477 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:27.379470 kernel: BTRFS info (device sda6): using free space tree
Feb 13 19:50:27.384751 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 19:50:27.385320 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 19:50:27.397925 initrd-setup-root[855]: cut: /sysroot/etc/passwd: No such file or directory
Feb 13 19:50:27.400825 initrd-setup-root[862]: cut: /sysroot/etc/group: No such file or directory
Feb 13 19:50:27.403136 initrd-setup-root[869]: cut: /sysroot/etc/shadow: No such file or directory
Feb 13 19:50:27.405660 initrd-setup-root[876]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 13 19:50:27.480352 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Feb 13 19:50:27.486487 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Feb 13 19:50:27.488942 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Feb 13 19:50:27.493421 kernel: BTRFS info (device sda6): last unmount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.507141 ignition[944]: INFO : Ignition 2.20.0
Feb 13 19:50:27.508191 ignition[944]: INFO : Stage: mount
Feb 13 19:50:27.508191 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.508191 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.508191 ignition[944]: INFO : mount: mount passed
Feb 13 19:50:27.508191 ignition[944]: INFO : Ignition finished successfully
Feb 13 19:50:27.509762 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Feb 13 19:50:27.510038 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Feb 13 19:50:27.514509 systemd[1]: Starting ignition-files.service - Ignition (files)... Feb 13 19:50:27.938824 systemd[1]: sysroot-oem.mount: Deactivated successfully. Feb 13 19:50:27.943502 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Feb 13 19:50:27.983419 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (956) Feb 13 19:50:27.986370 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f Feb 13 19:50:27.986417 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Feb 13 19:50:27.986432 kernel: BTRFS info (device sda6): using free space tree Feb 13 19:50:27.991419 kernel: BTRFS info (device sda6): enabling ssd optimizations Feb 13 19:50:27.992249 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Feb 13 19:50:28.011419 ignition[973]: INFO : Ignition 2.20.0 Feb 13 19:50:28.011419 ignition[973]: INFO : Stage: files Feb 13 19:50:28.011419 ignition[973]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:50:28.011419 ignition[973]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:50:28.012269 ignition[973]: DEBUG : files: compiled without relabeling support, skipping Feb 13 19:50:28.012432 ignition[973]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Feb 13 19:50:28.012432 ignition[973]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Feb 13 19:50:28.014549 ignition[973]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Feb 13 19:50:28.014692 ignition[973]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Feb 13 19:50:28.014845 unknown[973]: wrote ssh authorized keys file for user: core Feb 13 19:50:28.015052 ignition[973]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Feb 13 19:50:28.017213 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] 
writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 19:50:28.017480 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Feb 13 19:50:28.074436 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Feb 13 19:50:28.191437 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Feb 13 19:50:28.675049 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Feb 13 19:50:28.934250 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Feb 13 19:50:28.934250 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 19:50:28.934660 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): [started] processing unit "prepare-helm.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Feb 13 19:50:28.934660 ignition[973]: INFO : 
files: op(c): [finished] processing unit "prepare-helm.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): [started] processing unit "coreos-metadata.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service" Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service" Feb 13 19:50:29.011534 systemd-networkd[805]: ens192: Gained IPv6LL Feb 13 19:50:29.012835 ignition[973]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service" Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service" Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Feb 13 19:50:29.016606 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:50:29.016606 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Feb 13 19:50:29.016606 ignition[973]: INFO : files: files passed Feb 13 19:50:29.016606 ignition[973]: INFO : Ignition finished successfully Feb 13 19:50:29.016170 systemd[1]: Finished ignition-files.service - 
Ignition (files). Feb 13 19:50:29.023525 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Feb 13 19:50:29.025033 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Feb 13 19:50:29.032184 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:50:29.032184 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:50:29.032997 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Feb 13 19:50:29.033889 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:50:29.034229 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Feb 13 19:50:29.037491 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Feb 13 19:50:29.037699 systemd[1]: ignition-quench.service: Deactivated successfully. Feb 13 19:50:29.037744 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Feb 13 19:50:29.049214 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Feb 13 19:50:29.049264 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Feb 13 19:50:29.049813 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Feb 13 19:50:29.049913 systemd[1]: Reached target initrd.target - Initrd Default Target. Feb 13 19:50:29.050025 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Feb 13 19:50:29.050962 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Feb 13 19:50:29.059246 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. 
Feb 13 19:50:29.063520 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Feb 13 19:50:29.069005 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Feb 13 19:50:29.069263 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Feb 13 19:50:29.069457 systemd[1]: Stopped target timers.target - Timer Units. Feb 13 19:50:29.069595 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Feb 13 19:50:29.069663 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Feb 13 19:50:29.069869 systemd[1]: Stopped target initrd.target - Initrd Default Target. Feb 13 19:50:29.070084 systemd[1]: Stopped target basic.target - Basic System. Feb 13 19:50:29.070255 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Feb 13 19:50:29.070478 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Feb 13 19:50:29.070721 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Feb 13 19:50:29.070910 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Feb 13 19:50:29.071094 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Feb 13 19:50:29.071307 systemd[1]: Stopped target sysinit.target - System Initialization. Feb 13 19:50:29.071705 systemd[1]: Stopped target local-fs.target - Local File Systems. Feb 13 19:50:29.071911 systemd[1]: Stopped target swap.target - Swaps. Feb 13 19:50:29.072088 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Feb 13 19:50:29.072149 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Feb 13 19:50:29.072464 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:50:29.072617 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:50:29.072797 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
Feb 13 19:50:29.072846 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:50:29.072983 systemd[1]: dracut-initqueue.service: Deactivated successfully. Feb 13 19:50:29.073044 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Feb 13 19:50:29.073298 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Feb 13 19:50:29.073371 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Feb 13 19:50:29.073637 systemd[1]: Stopped target paths.target - Path Units. Feb 13 19:50:29.073787 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Feb 13 19:50:29.078428 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Feb 13 19:50:29.078612 systemd[1]: Stopped target slices.target - Slice Units. Feb 13 19:50:29.078840 systemd[1]: Stopped target sockets.target - Socket Units. Feb 13 19:50:29.078992 systemd[1]: iscsid.socket: Deactivated successfully. Feb 13 19:50:29.079069 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:50:29.079317 systemd[1]: iscsiuio.socket: Deactivated successfully. Feb 13 19:50:29.079365 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 19:50:29.079614 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Feb 13 19:50:29.079675 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Feb 13 19:50:29.079937 systemd[1]: ignition-files.service: Deactivated successfully. Feb 13 19:50:29.079998 systemd[1]: Stopped ignition-files.service - Ignition (files). Feb 13 19:50:29.085546 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Feb 13 19:50:29.085654 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Feb 13 19:50:29.085751 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. 
Feb 13 19:50:29.087533 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Feb 13 19:50:29.087649 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Feb 13 19:50:29.087743 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Feb 13 19:50:29.087962 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Feb 13 19:50:29.088033 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Feb 13 19:50:29.090573 systemd[1]: initrd-cleanup.service: Deactivated successfully. Feb 13 19:50:29.090624 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Feb 13 19:50:29.097154 ignition[1027]: INFO : Ignition 2.20.0 Feb 13 19:50:29.097661 ignition[1027]: INFO : Stage: umount Feb 13 19:50:29.097990 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d" Feb 13 19:50:29.098909 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Feb 13 19:50:29.098909 ignition[1027]: INFO : umount: umount passed Feb 13 19:50:29.098909 ignition[1027]: INFO : Ignition finished successfully Feb 13 19:50:29.100088 systemd[1]: ignition-mount.service: Deactivated successfully. Feb 13 19:50:29.100832 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Feb 13 19:50:29.101101 systemd[1]: Stopped target network.target - Network. Feb 13 19:50:29.101230 systemd[1]: ignition-disks.service: Deactivated successfully. Feb 13 19:50:29.101268 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Feb 13 19:50:29.101450 systemd[1]: ignition-kargs.service: Deactivated successfully. Feb 13 19:50:29.101476 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Feb 13 19:50:29.101650 systemd[1]: ignition-setup.service: Deactivated successfully. Feb 13 19:50:29.101676 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Feb 13 19:50:29.101815 systemd[1]: ignition-setup-pre.service: Deactivated successfully. 
Feb 13 19:50:29.101846 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Feb 13 19:50:29.102278 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Feb 13 19:50:29.102656 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Feb 13 19:50:29.104307 systemd[1]: sysroot-boot.mount: Deactivated successfully. Feb 13 19:50:29.105279 systemd[1]: systemd-resolved.service: Deactivated successfully. Feb 13 19:50:29.105578 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Feb 13 19:50:29.107011 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Feb 13 19:50:29.107398 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Feb 13 19:50:29.108131 systemd[1]: systemd-networkd.service: Deactivated successfully. Feb 13 19:50:29.108357 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Feb 13 19:50:29.108984 systemd[1]: systemd-networkd.socket: Deactivated successfully. Feb 13 19:50:29.109127 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:50:29.113471 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Feb 13 19:50:29.113580 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Feb 13 19:50:29.113614 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Feb 13 19:50:29.113758 systemd[1]: afterburn-network-kargs.service: Deactivated successfully. Feb 13 19:50:29.113784 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Feb 13 19:50:29.113922 systemd[1]: systemd-sysctl.service: Deactivated successfully. Feb 13 19:50:29.113947 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Feb 13 19:50:29.114070 systemd[1]: systemd-modules-load.service: Deactivated successfully. 
Feb 13 19:50:29.114094 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Feb 13 19:50:29.114258 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Feb 13 19:50:29.123625 systemd[1]: systemd-udevd.service: Deactivated successfully. Feb 13 19:50:29.123728 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Feb 13 19:50:29.124193 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Feb 13 19:50:29.124233 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Feb 13 19:50:29.124510 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Feb 13 19:50:29.124531 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Feb 13 19:50:29.124726 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Feb 13 19:50:29.124754 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Feb 13 19:50:29.124909 systemd[1]: dracut-cmdline.service: Deactivated successfully. Feb 13 19:50:29.124935 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Feb 13 19:50:29.125070 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Feb 13 19:50:29.125095 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Feb 13 19:50:29.127560 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Feb 13 19:50:29.128464 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Feb 13 19:50:29.128495 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Feb 13 19:50:29.128629 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Feb 13 19:50:29.128653 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:50:29.128951 systemd[1]: network-cleanup.service: Deactivated successfully. 
Feb 13 19:50:29.129002 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Feb 13 19:50:29.133235 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Feb 13 19:50:29.133444 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Feb 13 19:50:29.210374 systemd[1]: sysroot-boot.service: Deactivated successfully. Feb 13 19:50:29.210471 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Feb 13 19:50:29.210989 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Feb 13 19:50:29.211141 systemd[1]: initrd-setup-root.service: Deactivated successfully. Feb 13 19:50:29.211183 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Feb 13 19:50:29.215499 systemd[1]: Starting initrd-switch-root.service - Switch Root... Feb 13 19:50:29.227143 systemd[1]: Switching root. Feb 13 19:50:29.265607 systemd-journald[215]: Journal stopped Feb 13 19:50:24.735677 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001) Feb 13 19:50:24.735682 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ?
APIC 06040000 LTP 00000000) Feb 13 19:50:24.735688 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001) Feb 13 19:50:24.735693 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001) Feb 13 19:50:24.735698 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001) Feb 13 19:50:24.735705 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001) Feb 13 19:50:24.735711 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66] Feb 13 19:50:24.735716 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72] Feb 13 19:50:24.735721 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Feb 13 19:50:24.735726 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff] Feb 13 19:50:24.735731 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54] Feb 13 19:50:24.735736 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c] Feb 13 19:50:24.735742 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea] Feb 13 19:50:24.735747 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe] Feb 13 19:50:24.735753 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756] Feb 13 19:50:24.735759 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e] Feb 13 19:50:24.735764 kernel: system APIC only can use physical flat Feb 13 19:50:24.735769 kernel: APIC: Switched APIC routing to: physical flat Feb 13 19:50:24.735774 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Feb 13 19:50:24.735779 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0 Feb 13 19:50:24.735785 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0 Feb 13 19:50:24.735790 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0 Feb 13 19:50:24.735795 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0 Feb 13 19:50:24.735800 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0 Feb 13 19:50:24.735806 
kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0 Feb 13 19:50:24.735811 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0 Feb 13 19:50:24.735816 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0 Feb 13 19:50:24.735821 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0 Feb 13 19:50:24.735826 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0 Feb 13 19:50:24.735831 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0 Feb 13 19:50:24.735836 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0 Feb 13 19:50:24.735841 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0 Feb 13 19:50:24.735846 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0 Feb 13 19:50:24.735851 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0 Feb 13 19:50:24.735857 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0 Feb 13 19:50:24.735862 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0 Feb 13 19:50:24.735867 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0 Feb 13 19:50:24.735872 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0 Feb 13 19:50:24.735877 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0 Feb 13 19:50:24.735882 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0 Feb 13 19:50:24.735887 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0 Feb 13 19:50:24.735892 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0 Feb 13 19:50:24.735897 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0 Feb 13 19:50:24.735902 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0 Feb 13 19:50:24.735908 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0 Feb 13 19:50:24.735914 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0 Feb 13 19:50:24.735919 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0 Feb 13 19:50:24.735924 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0 Feb 13 19:50:24.735929 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0 Feb 13 19:50:24.735934 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0 Feb 13 19:50:24.735939 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0 Feb 13 19:50:24.735943 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0 Feb 13 19:50:24.735949 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0 Feb 13 19:50:24.735954 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0 Feb 13 19:50:24.735960 kernel: SRAT: PXM 0 
-> APIC 0x48 -> Node 0 Feb 13 19:50:24.735965 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0 Feb 13 19:50:24.735970 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0 Feb 13 19:50:24.735975 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0 Feb 13 19:50:24.735980 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0 Feb 13 19:50:24.735985 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0 Feb 13 19:50:24.735990 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0 Feb 13 19:50:24.735995 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0 Feb 13 19:50:24.736000 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0 Feb 13 19:50:24.736005 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0 Feb 13 19:50:24.736011 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0 Feb 13 19:50:24.736016 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0 Feb 13 19:50:24.736021 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0 Feb 13 19:50:24.736026 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0 Feb 13 19:50:24.736031 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0 Feb 13 19:50:24.736036 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0 Feb 13 19:50:24.736041 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0 Feb 13 19:50:24.736046 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0 Feb 13 19:50:24.736051 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0 Feb 13 19:50:24.736056 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0 Feb 13 19:50:24.736061 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0 Feb 13 19:50:24.736067 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0 Feb 13 19:50:24.736072 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0 Feb 13 19:50:24.736081 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0 Feb 13 19:50:24.736088 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0 Feb 13 19:50:24.736093 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0 Feb 13 19:50:24.736098 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0 Feb 13 19:50:24.736104 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0 Feb 13 19:50:24.736109 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0 Feb 13 19:50:24.736115 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0 Feb 13 19:50:24.736121 kernel: SRAT: PXM 0 -> APIC 0x84 -> 
Node 0 Feb 13 19:50:24.736126 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0 Feb 13 19:50:24.736132 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0 Feb 13 19:50:24.736137 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0 Feb 13 19:50:24.736142 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0 Feb 13 19:50:24.736147 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0 Feb 13 19:50:24.736153 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0 Feb 13 19:50:24.736158 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0 Feb 13 19:50:24.736163 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0 Feb 13 19:50:24.736170 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0 Feb 13 19:50:24.736175 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0 Feb 13 19:50:24.736181 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0 Feb 13 19:50:24.736186 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0 Feb 13 19:50:24.736191 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0 Feb 13 19:50:24.736197 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0 Feb 13 19:50:24.736202 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0 Feb 13 19:50:24.736207 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0 Feb 13 19:50:24.736213 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0 Feb 13 19:50:24.736218 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0 Feb 13 19:50:24.736225 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0 Feb 13 19:50:24.736230 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0 Feb 13 19:50:24.736235 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0 Feb 13 19:50:24.736240 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0 Feb 13 19:50:24.736246 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0 Feb 13 19:50:24.736251 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0 Feb 13 19:50:24.736256 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0 Feb 13 19:50:24.736262 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0 Feb 13 19:50:24.736267 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0 Feb 13 19:50:24.736272 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0 Feb 13 19:50:24.736277 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0 Feb 13 19:50:24.736284 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0 Feb 13 
19:50:24.736289 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0 Feb 13 19:50:24.736295 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0 Feb 13 19:50:24.736300 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0 Feb 13 19:50:24.736305 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0 Feb 13 19:50:24.736311 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0 Feb 13 19:50:24.736316 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0 Feb 13 19:50:24.736321 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0 Feb 13 19:50:24.736326 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0 Feb 13 19:50:24.736332 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0 Feb 13 19:50:24.736338 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0 Feb 13 19:50:24.736343 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0 Feb 13 19:50:24.736349 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0 Feb 13 19:50:24.736354 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0 Feb 13 19:50:24.736359 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0 Feb 13 19:50:24.736365 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0 Feb 13 19:50:24.736370 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0 Feb 13 19:50:24.736376 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0 Feb 13 19:50:24.736381 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0 Feb 13 19:50:24.736386 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0 Feb 13 19:50:24.736393 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0 Feb 13 19:50:24.736398 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0 Feb 13 19:50:24.736415 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0 Feb 13 19:50:24.736420 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0 Feb 13 19:50:24.736426 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0 Feb 13 19:50:24.736431 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0 Feb 13 19:50:24.736437 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0 Feb 13 19:50:24.736442 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0 Feb 13 19:50:24.736447 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0 Feb 13 19:50:24.736452 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0 Feb 13 19:50:24.736460 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0 Feb 13 19:50:24.736465 
kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0 Feb 13 19:50:24.736470 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff] Feb 13 19:50:24.736476 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff] Feb 13 19:50:24.736481 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug Feb 13 19:50:24.736487 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff] Feb 13 19:50:24.736493 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff] Feb 13 19:50:24.736499 kernel: Zone ranges: Feb 13 19:50:24.736504 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Feb 13 19:50:24.736509 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff] Feb 13 19:50:24.736516 kernel: Normal empty Feb 13 19:50:24.736522 kernel: Movable zone start for each node Feb 13 19:50:24.736527 kernel: Early memory node ranges Feb 13 19:50:24.736532 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff] Feb 13 19:50:24.736538 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff] Feb 13 19:50:24.736543 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff] Feb 13 19:50:24.736549 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff] Feb 13 19:50:24.736555 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Feb 13 19:50:24.736560 kernel: On node 0, zone DMA: 98 pages in unavailable ranges Feb 13 19:50:24.736567 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges Feb 13 19:50:24.736572 kernel: ACPI: PM-Timer IO Port: 0x1008 Feb 13 19:50:24.736578 kernel: system APIC only can use physical flat Feb 13 19:50:24.736583 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1]) Feb 13 19:50:24.736588 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1]) Feb 13 19:50:24.736594 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Feb 13 19:50:24.736599 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Feb 13 19:50:24.736605 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x04] high edge lint[0x1]) Feb 13 19:50:24.736610 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Feb 13 19:50:24.736615 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Feb 13 19:50:24.736622 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Feb 13 19:50:24.736627 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Feb 13 19:50:24.736633 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Feb 13 19:50:24.736638 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Feb 13 19:50:24.736643 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Feb 13 19:50:24.736649 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Feb 13 19:50:24.736654 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Feb 13 19:50:24.736660 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Feb 13 19:50:24.736665 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Feb 13 19:50:24.736671 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Feb 13 19:50:24.736677 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Feb 13 19:50:24.736682 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Feb 13 19:50:24.736688 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Feb 13 19:50:24.736697 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Feb 13 19:50:24.736703 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Feb 13 19:50:24.736708 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Feb 13 19:50:24.736714 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Feb 13 19:50:24.736719 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Feb 13 19:50:24.736724 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Feb 13 19:50:24.736731 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Feb 13 19:50:24.736737 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Feb 13 19:50:24.736742 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x1c] high edge lint[0x1]) Feb 13 19:50:24.736747 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Feb 13 19:50:24.736753 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Feb 13 19:50:24.736758 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Feb 13 19:50:24.736763 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Feb 13 19:50:24.736769 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Feb 13 19:50:24.736774 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Feb 13 19:50:24.736780 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Feb 13 19:50:24.736786 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Feb 13 19:50:24.736792 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Feb 13 19:50:24.736797 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Feb 13 19:50:24.736803 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Feb 13 19:50:24.736808 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Feb 13 19:50:24.736813 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Feb 13 19:50:24.736819 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Feb 13 19:50:24.736824 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Feb 13 19:50:24.736829 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Feb 13 19:50:24.736835 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Feb 13 19:50:24.736841 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Feb 13 19:50:24.736847 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Feb 13 19:50:24.736852 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Feb 13 19:50:24.736858 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Feb 13 19:50:24.736863 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Feb 13 19:50:24.736868 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Feb 13 19:50:24.736874 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x34] high edge lint[0x1]) Feb 13 19:50:24.736879 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Feb 13 19:50:24.736885 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Feb 13 19:50:24.736890 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Feb 13 19:50:24.736897 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Feb 13 19:50:24.736902 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Feb 13 19:50:24.736908 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Feb 13 19:50:24.736913 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Feb 13 19:50:24.736918 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Feb 13 19:50:24.736924 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Feb 13 19:50:24.736929 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Feb 13 19:50:24.736934 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Feb 13 19:50:24.736940 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Feb 13 19:50:24.736946 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Feb 13 19:50:24.736952 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Feb 13 19:50:24.736957 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Feb 13 19:50:24.736962 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Feb 13 19:50:24.736968 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Feb 13 19:50:24.736973 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Feb 13 19:50:24.736979 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Feb 13 19:50:24.736984 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Feb 13 19:50:24.736990 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Feb 13 19:50:24.736995 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Feb 13 19:50:24.737001 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Feb 13 19:50:24.737007 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x4c] high edge lint[0x1]) Feb 13 19:50:24.737012 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Feb 13 19:50:24.737018 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Feb 13 19:50:24.737023 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Feb 13 19:50:24.737028 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Feb 13 19:50:24.737034 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Feb 13 19:50:24.737039 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Feb 13 19:50:24.737044 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Feb 13 19:50:24.737050 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Feb 13 19:50:24.737056 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Feb 13 19:50:24.737062 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Feb 13 19:50:24.737067 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Feb 13 19:50:24.737073 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Feb 13 19:50:24.737078 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Feb 13 19:50:24.737084 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Feb 13 19:50:24.737089 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Feb 13 19:50:24.737094 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Feb 13 19:50:24.737100 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Feb 13 19:50:24.737106 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Feb 13 19:50:24.737112 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Feb 13 19:50:24.737117 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Feb 13 19:50:24.737123 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Feb 13 19:50:24.737128 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Feb 13 19:50:24.737134 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Feb 13 19:50:24.737139 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x64] high edge lint[0x1]) Feb 13 19:50:24.737144 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Feb 13 19:50:24.737150 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Feb 13 19:50:24.737155 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Feb 13 19:50:24.737162 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Feb 13 19:50:24.737167 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Feb 13 19:50:24.737173 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Feb 13 19:50:24.737178 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Feb 13 19:50:24.737183 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Feb 13 19:50:24.737189 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Feb 13 19:50:24.737194 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Feb 13 19:50:24.737199 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Feb 13 19:50:24.737205 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Feb 13 19:50:24.737210 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Feb 13 19:50:24.737217 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Feb 13 19:50:24.737223 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Feb 13 19:50:24.737228 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Feb 13 19:50:24.737233 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Feb 13 19:50:24.737239 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Feb 13 19:50:24.737244 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Feb 13 19:50:24.737250 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Feb 13 19:50:24.737255 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Feb 13 19:50:24.737260 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Feb 13 19:50:24.737267 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Feb 13 19:50:24.737272 kernel: ACPI: LAPIC_NMI 
(acpi_id[0x7c] high edge lint[0x1]) Feb 13 19:50:24.737278 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Feb 13 19:50:24.737283 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Feb 13 19:50:24.737289 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Feb 13 19:50:24.737294 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Feb 13 19:50:24.737299 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Feb 13 19:50:24.737305 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Feb 13 19:50:24.737310 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Feb 13 19:50:24.737316 kernel: TSC deadline timer available Feb 13 19:50:24.737322 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Feb 13 19:50:24.737328 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Feb 13 19:50:24.737333 kernel: Booting paravirtualized kernel on VMware hypervisor Feb 13 19:50:24.737339 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Feb 13 19:50:24.737344 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Feb 13 19:50:24.737350 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Feb 13 19:50:24.737356 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Feb 13 19:50:24.737361 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Feb 13 19:50:24.737367 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Feb 13 19:50:24.737373 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Feb 13 19:50:24.737378 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Feb 13 19:50:24.737384 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Feb 13 19:50:24.737396 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047 Feb 13 19:50:24.737438 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Feb 13 19:50:24.737444 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Feb 13 
19:50:24.737450 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Feb 13 19:50:24.737456 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Feb 13 19:50:24.737464 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Feb 13 19:50:24.737470 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Feb 13 19:50:24.737475 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Feb 13 19:50:24.737481 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Feb 13 19:50:24.737487 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Feb 13 19:50:24.737493 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Feb 13 19:50:24.737499 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=015d1d9e5e601f6a4e226c935072d3d0819e7eb2da20e68715973498f21aa3fe Feb 13 19:50:24.737505 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Feb 13 19:50:24.737512 kernel: random: crng init done Feb 13 19:50:24.737518 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Feb 13 19:50:24.737524 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Feb 13 19:50:24.737530 kernel: printk: log_buf_len min size: 262144 bytes Feb 13 19:50:24.737536 kernel: printk: log_buf_len: 1048576 bytes Feb 13 19:50:24.737542 kernel: printk: early log buf free: 239648(91%) Feb 13 19:50:24.737548 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Feb 13 19:50:24.737554 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Feb 13 19:50:24.737560 kernel: Fallback order for Node 0: 0 Feb 13 19:50:24.737571 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 515808 Feb 13 19:50:24.737577 kernel: Policy zone: DMA32 Feb 13 19:50:24.737582 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Feb 13 19:50:24.737589 kernel: Memory: 1934320K/2096628K available (14336K kernel code, 2301K rwdata, 22800K rodata, 43320K init, 1752K bss, 162048K reserved, 0K cma-reserved) Feb 13 19:50:24.737596 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Feb 13 19:50:24.737602 kernel: ftrace: allocating 37893 entries in 149 pages Feb 13 19:50:24.737609 kernel: ftrace: allocated 149 pages with 4 groups Feb 13 19:50:24.737615 kernel: Dynamic Preempt: voluntary Feb 13 19:50:24.737620 kernel: rcu: Preemptible hierarchical RCU implementation. Feb 13 19:50:24.737627 kernel: rcu: RCU event tracing is enabled. Feb 13 19:50:24.737634 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Feb 13 19:50:24.737640 kernel: Trampoline variant of Tasks RCU enabled. Feb 13 19:50:24.737646 kernel: Rude variant of Tasks RCU enabled. Feb 13 19:50:24.737652 kernel: Tracing variant of Tasks RCU enabled. Feb 13 19:50:24.737658 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Feb 13 19:50:24.737665 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Feb 13 19:50:24.737670 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Feb 13 19:50:24.737676 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. 
Feb 13 19:50:24.737682 kernel: Console: colour VGA+ 80x25 Feb 13 19:50:24.737688 kernel: printk: console [tty0] enabled Feb 13 19:50:24.737694 kernel: printk: console [ttyS0] enabled Feb 13 19:50:24.737700 kernel: ACPI: Core revision 20230628 Feb 13 19:50:24.737705 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Feb 13 19:50:24.737711 kernel: APIC: Switch to symmetric I/O mode setup Feb 13 19:50:24.737717 kernel: x2apic enabled Feb 13 19:50:24.737724 kernel: APIC: Switched APIC routing to: physical x2apic Feb 13 19:50:24.737730 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Feb 13 19:50:24.737736 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 19:50:24.737742 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000) Feb 13 19:50:24.737748 kernel: Disabled fast string operations Feb 13 19:50:24.737754 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Feb 13 19:50:24.737759 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Feb 13 19:50:24.737765 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Feb 13 19:50:24.737771 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Feb 13 19:50:24.737778 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Feb 13 19:50:24.737784 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Feb 13 19:50:24.737790 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Feb 13 19:50:24.737796 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Feb 13 19:50:24.737802 kernel: RETBleed: Mitigation: Enhanced IBRS Feb 13 19:50:24.737808 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Feb 13 19:50:24.737814 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via 
prctl Feb 13 19:50:24.737820 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Feb 13 19:50:24.737827 kernel: SRBDS: Unknown: Dependent on hypervisor status Feb 13 19:50:24.737833 kernel: GDS: Unknown: Dependent on hypervisor status Feb 13 19:50:24.737838 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Feb 13 19:50:24.737844 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Feb 13 19:50:24.737850 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Feb 13 19:50:24.737856 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Feb 13 19:50:24.737862 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. Feb 13 19:50:24.737868 kernel: Freeing SMP alternatives memory: 32K Feb 13 19:50:24.737873 kernel: pid_max: default: 131072 minimum: 1024 Feb 13 19:50:24.737880 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Feb 13 19:50:24.737886 kernel: landlock: Up and running. Feb 13 19:50:24.737892 kernel: SELinux: Initializing. Feb 13 19:50:24.737898 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.737904 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.737910 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Feb 13 19:50:24.737916 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737922 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737928 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Feb 13 19:50:24.737935 kernel: Performance Events: Skylake events, core PMU driver. 
Feb 13 19:50:24.737941 kernel: core: CPUID marked event: 'cpu cycles' unavailable Feb 13 19:50:24.737947 kernel: core: CPUID marked event: 'instructions' unavailable Feb 13 19:50:24.737952 kernel: core: CPUID marked event: 'bus cycles' unavailable Feb 13 19:50:24.737958 kernel: core: CPUID marked event: 'cache references' unavailable Feb 13 19:50:24.737964 kernel: core: CPUID marked event: 'cache misses' unavailable Feb 13 19:50:24.737969 kernel: core: CPUID marked event: 'branch instructions' unavailable Feb 13 19:50:24.737975 kernel: core: CPUID marked event: 'branch misses' unavailable Feb 13 19:50:24.737982 kernel: ... version: 1 Feb 13 19:50:24.737988 kernel: ... bit width: 48 Feb 13 19:50:24.737994 kernel: ... generic registers: 4 Feb 13 19:50:24.738000 kernel: ... value mask: 0000ffffffffffff Feb 13 19:50:24.738005 kernel: ... max period: 000000007fffffff Feb 13 19:50:24.738011 kernel: ... fixed-purpose events: 0 Feb 13 19:50:24.738017 kernel: ... event mask: 000000000000000f Feb 13 19:50:24.738023 kernel: signal: max sigframe size: 1776 Feb 13 19:50:24.738029 kernel: rcu: Hierarchical SRCU implementation. Feb 13 19:50:24.738036 kernel: rcu: Max phase no-delay instances is 400. Feb 13 19:50:24.738043 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Feb 13 19:50:24.738049 kernel: smp: Bringing up secondary CPUs ... Feb 13 19:50:24.738055 kernel: smpboot: x86: Booting SMP configuration: Feb 13 19:50:24.738061 kernel: .... 
node #0, CPUs: #1 Feb 13 19:50:24.738066 kernel: Disabled fast string operations Feb 13 19:50:24.738072 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Feb 13 19:50:24.738078 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Feb 13 19:50:24.738084 kernel: smp: Brought up 1 node, 2 CPUs Feb 13 19:50:24.738089 kernel: smpboot: Max logical packages: 128 Feb 13 19:50:24.738095 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Feb 13 19:50:24.738102 kernel: devtmpfs: initialized Feb 13 19:50:24.738108 kernel: x86/mm: Memory block size: 128MB Feb 13 19:50:24.738114 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Feb 13 19:50:24.738120 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Feb 13 19:50:24.738126 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Feb 13 19:50:24.738131 kernel: pinctrl core: initialized pinctrl subsystem Feb 13 19:50:24.738137 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Feb 13 19:50:24.738143 kernel: audit: initializing netlink subsys (disabled) Feb 13 19:50:24.738151 kernel: audit: type=2000 audit(1739476223.071:1): state=initialized audit_enabled=0 res=1 Feb 13 19:50:24.738156 kernel: thermal_sys: Registered thermal governor 'step_wise' Feb 13 19:50:24.738162 kernel: thermal_sys: Registered thermal governor 'user_space' Feb 13 19:50:24.738168 kernel: cpuidle: using governor menu Feb 13 19:50:24.738174 kernel: Simple Boot Flag at 0x36 set to 0x80 Feb 13 19:50:24.738180 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Feb 13 19:50:24.738185 kernel: dca service started, version 1.12.1 Feb 13 19:50:24.738191 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Feb 13 19:50:24.738197 kernel: PCI: Using configuration type 1 for base access Feb 13 19:50:24.738203 kernel: kprobes: kprobe jump-optimization is 
enabled. All kprobes are optimized if possible. Feb 13 19:50:24.738210 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Feb 13 19:50:24.738216 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Feb 13 19:50:24.738222 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Feb 13 19:50:24.738228 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Feb 13 19:50:24.738234 kernel: ACPI: Added _OSI(Module Device) Feb 13 19:50:24.738239 kernel: ACPI: Added _OSI(Processor Device) Feb 13 19:50:24.738245 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Feb 13 19:50:24.738251 kernel: ACPI: Added _OSI(Processor Aggregator Device) Feb 13 19:50:24.738257 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Feb 13 19:50:24.738264 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Feb 13 19:50:24.738270 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Feb 13 19:50:24.738276 kernel: ACPI: Interpreter enabled Feb 13 19:50:24.738282 kernel: ACPI: PM: (supports S0 S1 S5) Feb 13 19:50:24.738287 kernel: ACPI: Using IOAPIC for interrupt routing Feb 13 19:50:24.738293 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Feb 13 19:50:24.738299 kernel: PCI: Using E820 reservations for host bridge windows Feb 13 19:50:24.738305 kernel: ACPI: Enabled 4 GPEs in block 00 to 0F Feb 13 19:50:24.738312 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Feb 13 19:50:24.738393 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Feb 13 19:50:24.738463 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Feb 13 19:50:24.738513 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Feb 13 19:50:24.738521 kernel: PCI host bridge to bus 0000:00 Feb 13 19:50:24.738571 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Feb 13 19:50:24.738616 kernel: 
pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Feb 13 19:50:24.738663 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Feb 13 19:50:24.738719 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Feb 13 19:50:24.738764 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Feb 13 19:50:24.738807 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Feb 13 19:50:24.738866 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Feb 13 19:50:24.738921 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Feb 13 19:50:24.738980 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Feb 13 19:50:24.739034 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Feb 13 19:50:24.739084 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Feb 13 19:50:24.739133 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Feb 13 19:50:24.739182 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Feb 13 19:50:24.739231 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Feb 13 19:50:24.739279 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Feb 13 19:50:24.739378 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Feb 13 19:50:24.739452 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed by PIIX4 ACPI Feb 13 19:50:24.739504 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Feb 13 19:50:24.739557 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Feb 13 19:50:24.739606 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Feb 13 19:50:24.739655 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Feb 13 19:50:24.739715 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Feb 13 19:50:24.739764 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Feb 13 19:50:24.739813 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff 
pref] Feb 13 19:50:24.739861 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Feb 13 19:50:24.739910 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Feb 13 19:50:24.739959 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Feb 13 19:50:24.740012 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Feb 13 19:50:24.740069 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740120 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740193 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740293 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740378 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740812 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740876 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.740929 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.740983 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.741035 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.741090 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.741141 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742481 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742543 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742601 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742654 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742710 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.742761 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.742819 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Feb 13 
19:50:24.742873 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744439 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744505 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744563 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744616 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744674 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744730 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744784 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744834 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.744890 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.744942 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745000 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745051 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745105 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745155 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745208 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745259 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.745315 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.745365 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.747443 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.747508 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.747567 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748477 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748539 kernel: pci 0000:00:17.5: [15ad:07a0] 
type 01 class 0x060400 Feb 13 19:50:24.748596 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748655 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748713 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748769 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748820 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748874 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.748928 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.748982 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749032 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749086 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749136 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749190 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749244 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749297 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749347 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749415 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749474 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749527 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749580 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749636 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Feb 13 19:50:24.749687 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.749741 kernel: pci_bus 0000:01: extended config space not accessible Feb 13 19:50:24.749793 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 19:50:24.749845 kernel: pci_bus 
0000:02: extended config space not accessible Feb 13 19:50:24.749855 kernel: acpiphp: Slot [32] registered Feb 13 19:50:24.749863 kernel: acpiphp: Slot [33] registered Feb 13 19:50:24.749869 kernel: acpiphp: Slot [34] registered Feb 13 19:50:24.749875 kernel: acpiphp: Slot [35] registered Feb 13 19:50:24.749881 kernel: acpiphp: Slot [36] registered Feb 13 19:50:24.749887 kernel: acpiphp: Slot [37] registered Feb 13 19:50:24.749893 kernel: acpiphp: Slot [38] registered Feb 13 19:50:24.749899 kernel: acpiphp: Slot [39] registered Feb 13 19:50:24.749905 kernel: acpiphp: Slot [40] registered Feb 13 19:50:24.749911 kernel: acpiphp: Slot [41] registered Feb 13 19:50:24.749918 kernel: acpiphp: Slot [42] registered Feb 13 19:50:24.749924 kernel: acpiphp: Slot [43] registered Feb 13 19:50:24.749930 kernel: acpiphp: Slot [44] registered Feb 13 19:50:24.749936 kernel: acpiphp: Slot [45] registered Feb 13 19:50:24.749941 kernel: acpiphp: Slot [46] registered Feb 13 19:50:24.749947 kernel: acpiphp: Slot [47] registered Feb 13 19:50:24.749953 kernel: acpiphp: Slot [48] registered Feb 13 19:50:24.749959 kernel: acpiphp: Slot [49] registered Feb 13 19:50:24.749964 kernel: acpiphp: Slot [50] registered Feb 13 19:50:24.749970 kernel: acpiphp: Slot [51] registered Feb 13 19:50:24.749977 kernel: acpiphp: Slot [52] registered Feb 13 19:50:24.749983 kernel: acpiphp: Slot [53] registered Feb 13 19:50:24.749989 kernel: acpiphp: Slot [54] registered Feb 13 19:50:24.749995 kernel: acpiphp: Slot [55] registered Feb 13 19:50:24.750001 kernel: acpiphp: Slot [56] registered Feb 13 19:50:24.750007 kernel: acpiphp: Slot [57] registered Feb 13 19:50:24.750013 kernel: acpiphp: Slot [58] registered Feb 13 19:50:24.750018 kernel: acpiphp: Slot [59] registered Feb 13 19:50:24.750025 kernel: acpiphp: Slot [60] registered Feb 13 19:50:24.750032 kernel: acpiphp: Slot [61] registered Feb 13 19:50:24.750038 kernel: acpiphp: Slot [62] registered Feb 13 19:50:24.750044 kernel: acpiphp: Slot [63] registered 
Feb 13 19:50:24.750094 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Feb 13 19:50:24.750144 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 19:50:24.750193 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 19:50:24.750241 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:50:24.750290 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Feb 13 19:50:24.750341 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Feb 13 19:50:24.750390 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Feb 13 19:50:24.755478 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Feb 13 19:50:24.755541 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Feb 13 19:50:24.755602 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Feb 13 19:50:24.755656 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Feb 13 19:50:24.755708 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Feb 13 19:50:24.755763 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 19:50:24.755814 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Feb 13 19:50:24.755866 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 19:50:24.755919 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 19:50:24.755969 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 19:50:24.756019 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 19:50:24.756071 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 19:50:24.756121 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 19:50:24.756174 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 19:50:24.756223 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 19:50:24.756275 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 19:50:24.756325 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 19:50:24.756374 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 19:50:24.756437 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:50:24.756490 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 19:50:24.756544 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 19:50:24.756594 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:50:24.756645 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 19:50:24.756694 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 19:50:24.756744 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:50:24.756798 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 19:50:24.756848 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 19:50:24.756897 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:50:24.756949 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 19:50:24.756999 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 19:50:24.757048 kernel: pci 0000:00:15.6: bridge 
window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:50:24.757100 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 19:50:24.757150 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 19:50:24.757203 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:50:24.757261 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Feb 13 19:50:24.757313 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Feb 13 19:50:24.757364 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Feb 13 19:50:24.759476 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Feb 13 19:50:24.759554 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Feb 13 19:50:24.759610 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Feb 13 19:50:24.759668 kernel: pci 0000:0b:00.0: supports D1 D2 Feb 13 19:50:24.759720 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Feb 13 19:50:24.759772 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Feb 13 19:50:24.759825 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 19:50:24.759877 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 19:50:24.759927 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Feb 13 19:50:24.759979 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 19:50:24.760030 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 19:50:24.760083 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 19:50:24.760133 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:50:24.760185 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 19:50:24.760236 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 19:50:24.760286 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 19:50:24.760335 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:50:24.760387 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 19:50:24.760503 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 19:50:24.760555 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:50:24.760607 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 19:50:24.760657 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 19:50:24.760709 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:50:24.760761 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 19:50:24.760810 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 19:50:24.760859 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:50:24.760914 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 19:50:24.760963 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 19:50:24.761012 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:50:24.761064 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 19:50:24.761113 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 19:50:24.761162 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:50:24.761213 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 19:50:24.761263 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 19:50:24.761312 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 19:50:24.761364 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:50:24.761435 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 19:50:24.761487 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 19:50:24.761537 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 19:50:24.761587 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:50:24.761639 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 19:50:24.761689 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 19:50:24.761747 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 19:50:24.761797 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:50:24.761848 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 19:50:24.761898 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 19:50:24.761948 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:50:24.761998 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 19:50:24.762048 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 19:50:24.762098 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:50:24.762152 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 19:50:24.762202 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 19:50:24.762252 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 19:50:24.762303 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 19:50:24.762353 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 19:50:24.762409 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:50:24.762479 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 19:50:24.762529 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 19:50:24.762581 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:50:24.762633 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 19:50:24.762683 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 19:50:24.762732 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 19:50:24.762782 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:50:24.762834 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 19:50:24.762883 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 19:50:24.762935 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 19:50:24.762984 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:50:24.763036 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 19:50:24.763086 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 19:50:24.763135 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:50:24.763187 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 19:50:24.763236 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 19:50:24.763285 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:50:24.763339 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 
19:50:24.763388 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 19:50:24.763478 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 19:50:24.763529 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 19:50:24.763580 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 19:50:24.763629 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 19:50:24.763679 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 19:50:24.763728 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 19:50:24.763780 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 19:50:24.763831 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 19:50:24.763880 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 19:50:24.763929 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 19:50:24.763938 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Feb 13 19:50:24.763944 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Feb 13 19:50:24.763950 kernel: ACPI: PCI: Interrupt link LNKB disabled Feb 13 19:50:24.763956 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Feb 13 19:50:24.763962 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Feb 13 19:50:24.763971 kernel: iommu: Default domain type: Translated Feb 13 19:50:24.763977 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Feb 13 19:50:24.763983 kernel: PCI: Using ACPI for IRQ routing Feb 13 19:50:24.763989 kernel: PCI: pci_cache_line_size set to 64 bytes Feb 13 19:50:24.763995 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Feb 13 19:50:24.764001 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Feb 13 19:50:24.764051 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Feb 13 19:50:24.764100 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Feb 13 19:50:24.764149 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Feb 13 19:50:24.764160 kernel: vgaarb: loaded Feb 13 19:50:24.764166 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Feb 13 19:50:24.764172 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Feb 13 19:50:24.764178 kernel: clocksource: Switched to clocksource tsc-early Feb 13 19:50:24.764184 kernel: VFS: Disk quotas dquot_6.6.0 Feb 13 19:50:24.764191 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Feb 13 19:50:24.764197 kernel: pnp: PnP ACPI init Feb 13 19:50:24.764250 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Feb 13 19:50:24.764299 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Feb 13 19:50:24.764344 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Feb 13 19:50:24.764393 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Feb 13 19:50:24.764475 kernel: pnp 00:06: [dma 2] Feb 13 19:50:24.764525 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Feb 13 19:50:24.764570 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Feb 13 19:50:24.764617 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Feb 13 19:50:24.764626 kernel: pnp: PnP ACPI: found 8 devices Feb 13 19:50:24.764632 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Feb 13 19:50:24.764638 kernel: NET: Registered PF_INET protocol family Feb 13 19:50:24.764644 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Feb 13 19:50:24.764650 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Feb 13 19:50:24.764656 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Feb 13 19:50:24.764662 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Feb 13 19:50:24.764668 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Feb 13 19:50:24.764676 kernel: TCP: Hash tables configured (established 16384 bind 16384) Feb 13 19:50:24.764682 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.764688 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Feb 13 19:50:24.764697 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Feb 13 19:50:24.764703 kernel: NET: Registered PF_XDP protocol family Feb 13 19:50:24.764755 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Feb 13 19:50:24.764806 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Feb 13 19:50:24.764857 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Feb 13 19:50:24.764910 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Feb 13 19:50:24.764961 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Feb 13 19:50:24.765011 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Feb 13 19:50:24.765061 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Feb 13 19:50:24.765111 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Feb 13 19:50:24.765162 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Feb 13 19:50:24.765214 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Feb 13 19:50:24.765264 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Feb 13 19:50:24.765314 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Feb 13 19:50:24.765364 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Feb 13 
19:50:24.765420 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Feb 13 19:50:24.765473 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Feb 13 19:50:24.765523 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Feb 13 19:50:24.765573 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Feb 13 19:50:24.765622 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Feb 13 19:50:24.765671 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Feb 13 19:50:24.765721 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Feb 13 19:50:24.765789 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Feb 13 19:50:24.765839 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Feb 13 19:50:24.765888 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Feb 13 19:50:24.765938 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:50:24.765987 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:50:24.766036 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766085 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766137 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766187 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766236 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766285 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766334 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766384 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 
13 19:50:24.766460 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766510 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766562 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766611 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766660 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766709 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766758 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766806 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766856 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.766906 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.766958 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767007 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767057 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767105 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767154 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767203 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767252 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767300 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767352 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767407 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767458 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767508 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767557 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767606 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767663 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767718 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767772 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767822 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767872 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.767922 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.767971 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768021 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768070 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768119 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768171 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768220 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768270 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768320 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768369 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768461 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768511 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768560 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768609 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768657 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768714 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Feb 13 19:50:24.768778 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768825 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768881 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.768929 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.768977 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769025 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769074 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769122 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769174 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769223 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769272 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769320 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769369 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769440 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769490 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769539 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769587 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769639 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769688 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769737 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769785 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769833 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769881 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.769929 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.769997 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.770046 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.770095 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.770147 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.770195 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.770244 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Feb 13 19:50:24.770293 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Feb 13 19:50:24.770344 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Feb 13 19:50:24.770394 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Feb 13 19:50:24.770490 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Feb 13 19:50:24.770539 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Feb 13 19:50:24.770587 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:50:24.770643 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Feb 13 19:50:24.770727 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Feb 13 19:50:24.770791 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Feb 13 19:50:24.770839 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Feb 13 19:50:24.770887 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:50:24.770937 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Feb 13 19:50:24.770985 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Feb 13 19:50:24.771034 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Feb 13 19:50:24.771085 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 
19:50:24.771135 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Feb 13 19:50:24.771184 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Feb 13 19:50:24.771233 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Feb 13 19:50:24.771282 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:50:24.771330 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Feb 13 19:50:24.771378 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Feb 13 19:50:24.771531 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:50:24.771582 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Feb 13 19:50:24.771630 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Feb 13 19:50:24.771682 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:50:24.771737 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Feb 13 19:50:24.771786 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Feb 13 19:50:24.771835 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:50:24.771882 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Feb 13 19:50:24.771933 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Feb 13 19:50:24.771984 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:50:24.772033 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Feb 13 19:50:24.772082 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Feb 13 19:50:24.772130 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:50:24.772182 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Feb 13 19:50:24.772232 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Feb 13 19:50:24.772281 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Feb 13 19:50:24.772331 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Feb 13 19:50:24.772380 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:50:24.772444 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Feb 13 19:50:24.772495 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Feb 13 19:50:24.772544 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Feb 13 19:50:24.772593 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:50:24.772642 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Feb 13 19:50:24.772691 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Feb 13 19:50:24.772769 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Feb 13 19:50:24.772819 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:50:24.772870 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Feb 13 19:50:24.772923 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Feb 13 19:50:24.772973 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:50:24.773022 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Feb 13 19:50:24.773072 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Feb 13 19:50:24.773122 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:50:24.773172 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Feb 13 19:50:24.773222 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Feb 13 19:50:24.773272 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:50:24.773322 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Feb 13 19:50:24.773371 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Feb 13 19:50:24.773431 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:50:24.773483 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Feb 13 19:50:24.773533 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Feb 13 19:50:24.773585 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:50:24.773636 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Feb 13 19:50:24.773687 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Feb 13 19:50:24.773738 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Feb 13 19:50:24.773800 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:50:24.773874 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Feb 13 19:50:24.773938 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Feb 13 19:50:24.774004 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Feb 13 19:50:24.774066 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:50:24.774136 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Feb 13 19:50:24.774197 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Feb 13 19:50:24.774259 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Feb 13 19:50:24.774311 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:50:24.774362 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Feb 13 19:50:24.774437 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Feb 13 19:50:24.774498 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:50:24.774553 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Feb 13 19:50:24.774603 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Feb 13 19:50:24.774653 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:50:24.774707 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Feb 13 19:50:24.774758 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Feb 13 19:50:24.774808 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 
19:50:24.774859 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Feb 13 19:50:24.774908 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Feb 13 19:50:24.774958 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:50:24.775011 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Feb 13 19:50:24.775060 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Feb 13 19:50:24.775109 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:50:24.775160 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Feb 13 19:50:24.775220 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Feb 13 19:50:24.775290 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Feb 13 19:50:24.775356 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:50:24.775481 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Feb 13 19:50:24.775549 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Feb 13 19:50:24.775610 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Feb 13 19:50:24.775683 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:50:24.775742 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Feb 13 19:50:24.775792 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Feb 13 19:50:24.775842 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:50:24.775892 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Feb 13 19:50:24.775941 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Feb 13 19:50:24.775990 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:50:24.776040 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Feb 13 19:50:24.776089 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Feb 13 19:50:24.776142 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Feb 13 19:50:24.776192 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Feb 13 19:50:24.776242 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Feb 13 19:50:24.776291 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 19:50:24.776341 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Feb 13 19:50:24.776390 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Feb 13 19:50:24.776484 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 19:50:24.776536 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Feb 13 19:50:24.776586 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Feb 13 19:50:24.776636 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 19:50:24.776688 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 19:50:24.776733 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 19:50:24.776785 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 19:50:24.776848 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Feb 13 19:50:24.776907 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Feb 13 19:50:24.776968 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Feb 13 19:50:24.777033 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Feb 13 19:50:24.777100 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Feb 13 19:50:24.777156 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Feb 13 19:50:24.777215 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Feb 13 19:50:24.777266 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Feb 13 19:50:24.777318 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Feb 13 19:50:24.777364 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Feb 13 19:50:24.778453 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Feb 13 19:50:24.778518 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Feb 13 19:50:24.778577 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Feb 13 19:50:24.778647 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Feb 13 19:50:24.778719 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Feb 13 19:50:24.778783 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Feb 13 19:50:24.778835 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Feb 13 19:50:24.778882 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Feb 13 19:50:24.778941 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Feb 13 19:50:24.779009 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Feb 13 19:50:24.779060 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Feb 13 19:50:24.779122 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Feb 13 19:50:24.779186 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Feb 13 19:50:24.779252 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Feb 13 19:50:24.779303 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Feb 13 19:50:24.779366 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Feb 13 19:50:24.780454 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Feb 13 19:50:24.780526 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Feb 13 19:50:24.780603 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Feb 13 19:50:24.780663 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Feb 13 19:50:24.780718 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Feb 13 19:50:24.780777 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Feb 13 19:50:24.780834 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Feb 13 19:50:24.780887 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Feb 13 19:50:24.780941 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Feb 13 19:50:24.780998 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Feb 13 19:50:24.781060 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Feb 13 19:50:24.781131 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Feb 13 19:50:24.781187 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Feb 13 19:50:24.781241 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Feb 13 19:50:24.781296 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Feb 13 19:50:24.781365 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Feb 13 19:50:24.781929 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Feb 13 19:50:24.781988 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Feb 13 19:50:24.782062 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Feb 13 19:50:24.782116 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Feb 13 19:50:24.782190 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Feb 13 19:50:24.782256 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Feb 13 19:50:24.782325 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Feb 13 19:50:24.782382 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Feb 13 19:50:24.783294 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Feb 13 19:50:24.783367 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Feb 13 19:50:24.783485 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Feb 13 19:50:24.783592 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Feb 13 19:50:24.783651 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Feb 13 19:50:24.783701 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Feb 13 19:50:24.783753 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Feb 13 19:50:24.783809 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Feb 13 19:50:24.783857 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Feb 13 19:50:24.783910 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Feb 13 19:50:24.783958 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Feb 13 19:50:24.784011 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Feb 13 19:50:24.784060 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Feb 13 19:50:24.784115 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Feb 13 19:50:24.784162 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Feb 13 19:50:24.784213 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Feb 13 19:50:24.784260 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Feb 13 19:50:24.784314 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Feb 13 19:50:24.784364 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Feb 13 19:50:24.784424 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Feb 13 19:50:24.784477 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Feb 13 19:50:24.784536 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Feb 13 19:50:24.784585 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Feb 13 19:50:24.784637 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Feb 13 19:50:24.784684 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Feb 13 19:50:24.784740 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Feb 13 19:50:24.784787 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Feb 13 19:50:24.784838 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Feb 13 19:50:24.784885 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Feb 13 19:50:24.784939 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Feb 13 19:50:24.784986 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Feb 13 19:50:24.785047 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Feb 13 19:50:24.785096 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Feb 13 19:50:24.785158 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Feb 13 19:50:24.785208 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Feb 13 19:50:24.785268 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Feb 13 19:50:24.785279 kernel: PCI: CLS 32 bytes, default 64 Feb 13 19:50:24.785286 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Feb 13 19:50:24.785296 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Feb 13 19:50:24.785302 kernel: clocksource: Switched to clocksource tsc Feb 13 19:50:24.785309 kernel: Initialise system trusted keyrings Feb 13 19:50:24.785316 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Feb 13 19:50:24.785322 kernel: Key type asymmetric registered Feb 13 19:50:24.785328 kernel: Asymmetric key parser 'x509' registered Feb 13 19:50:24.785336 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Feb 13 19:50:24.785343 kernel: io scheduler mq-deadline registered Feb 13 19:50:24.785349 kernel: io scheduler kyber registered Feb 13 19:50:24.785357 kernel: io scheduler bfq registered Feb 13 19:50:24.786974 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Feb 13 19:50:24.787042 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787100 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Feb 13 19:50:24.787153 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787208 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Feb 13 19:50:24.787284 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787352 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Feb 13 19:50:24.787424 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787492 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Feb 13 19:50:24.787555 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787610 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Feb 13 19:50:24.787690 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787773 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Feb 13 19:50:24.787838 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.787902 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Feb 13 19:50:24.787957 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788010 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Feb 13 19:50:24.788070 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788145 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Feb 13 19:50:24.788208 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788266 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Feb 13 19:50:24.788321 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788385 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Feb 13 19:50:24.788479 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788542 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Feb 13 19:50:24.788595 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788648 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Feb 13 19:50:24.788714 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788792 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Feb 13 19:50:24.788849 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.788907 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Feb 13 19:50:24.788961 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789019 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Feb 13 19:50:24.789082 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789161 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Feb 13 19:50:24.789216 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789273 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Feb 13 19:50:24.789333 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789573 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Feb 13 19:50:24.789632 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789703 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Feb 13 19:50:24.789762 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789832 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Feb 13 19:50:24.789886 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.789940 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Feb 13 19:50:24.790005 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790070 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Feb 13 19:50:24.790131 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790185 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Feb 13 19:50:24.790253 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790335 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Feb 13 19:50:24.790391 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790496 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Feb 13 19:50:24.790575 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790630 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Feb 13 19:50:24.790686 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790758 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Feb 13 19:50:24.790824 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.790877 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Feb 13 19:50:24.790936 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.791005 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Feb 13 19:50:24.791069 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.791127 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Feb 13 19:50:24.791186 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Feb 13 19:50:24.791198 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Feb 13 19:50:24.791206 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Feb 13 19:50:24.791212 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Feb 13 19:50:24.791219 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Feb 13 19:50:24.791225 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Feb 13 19:50:24.791232 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Feb 13 19:50:24.791285 kernel: rtc_cmos 00:01: registered as rtc0 Feb 13 19:50:24.791342 kernel: rtc_cmos 00:01: setting system clock to 2025-02-13T19:50:24 UTC (1739476224) Feb 13 19:50:24.791772 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Feb 13 19:50:24.791787 kernel: intel_pstate: CPU model not supported Feb 13 19:50:24.791799 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Feb 13 19:50:24.791811 kernel: NET: Registered PF_INET6 protocol family Feb 13 19:50:24.791823 kernel: Segment Routing with IPv6 Feb 13 19:50:24.791833 kernel: In-situ OAM (IOAM) with IPv6 Feb 13 19:50:24.791844 kernel: NET: Registered PF_PACKET protocol family Feb 13 19:50:24.791851 kernel: Key type dns_resolver registered Feb 13 19:50:24.791862 kernel: IPI shorthand broadcast: enabled Feb 13 19:50:24.791870 kernel: sched_clock: Marking stable (936180487, 233095616)->(1232030088, -62753985) Feb 13 19:50:24.791886 kernel: registered taskstats version 1 Feb 13 19:50:24.791897 kernel: Loading compiled-in X.509 certificates Feb 13 19:50:24.791908 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.74-flatcar: b3acedbed401b3cd9632ee9302ddcce254d8924d' Feb 13 19:50:24.791919 kernel: Key type .fscrypt registered Feb 13 19:50:24.791928 kernel: Key type fscrypt-provisioning registered Feb 13 19:50:24.791934 kernel: ima: No TPM chip found, activating TPM-bypass! 
Feb 13 19:50:24.791940 kernel: ima: Allocated hash algorithm: sha1 Feb 13 19:50:24.791951 kernel: ima: No architecture policies found Feb 13 19:50:24.791957 kernel: clk: Disabling unused clocks Feb 13 19:50:24.791964 kernel: Freeing unused kernel image (initmem) memory: 43320K Feb 13 19:50:24.791972 kernel: Write protecting the kernel read-only data: 38912k Feb 13 19:50:24.791979 kernel: Freeing unused kernel image (rodata/data gap) memory: 1776K Feb 13 19:50:24.791985 kernel: Run /init as init process Feb 13 19:50:24.791991 kernel: with arguments: Feb 13 19:50:24.792000 kernel: /init Feb 13 19:50:24.792007 kernel: with environment: Feb 13 19:50:24.792016 kernel: HOME=/ Feb 13 19:50:24.792022 kernel: TERM=linux Feb 13 19:50:24.792029 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Feb 13 19:50:24.792036 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Feb 13 19:50:24.792045 systemd[1]: Detected virtualization vmware. Feb 13 19:50:24.792053 systemd[1]: Detected architecture x86-64. Feb 13 19:50:24.792061 systemd[1]: Running in initrd. Feb 13 19:50:24.792070 systemd[1]: No hostname configured, using default hostname. Feb 13 19:50:24.792079 systemd[1]: Hostname set to . Feb 13 19:50:24.792086 systemd[1]: Initializing machine ID from random generator. Feb 13 19:50:24.792093 systemd[1]: Queued start job for default target initrd.target. Feb 13 19:50:24.792100 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Feb 13 19:50:24.792109 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Feb 13 19:50:24.792117 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Feb 13 19:50:24.792123 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Feb 13 19:50:24.792132 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Feb 13 19:50:24.792143 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Feb 13 19:50:24.792151 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Feb 13 19:50:24.792160 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Feb 13 19:50:24.792169 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Feb 13 19:50:24.792177 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Feb 13 19:50:24.792183 systemd[1]: Reached target paths.target - Path Units. Feb 13 19:50:24.792191 systemd[1]: Reached target slices.target - Slice Units. Feb 13 19:50:24.792204 systemd[1]: Reached target swap.target - Swaps. Feb 13 19:50:24.792213 systemd[1]: Reached target timers.target - Timer Units. Feb 13 19:50:24.792219 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Feb 13 19:50:24.792226 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Feb 13 19:50:24.792235 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Feb 13 19:50:24.792242 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Feb 13 19:50:24.792249 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Feb 13 19:50:24.792255 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Feb 13 19:50:24.792262 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Feb 13 19:50:24.792271 systemd[1]: Reached target sockets.target - Socket Units. Feb 13 19:50:24.792279 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Feb 13 19:50:24.792286 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Feb 13 19:50:24.792294 systemd[1]: Finished network-cleanup.service - Network Cleanup. Feb 13 19:50:24.792301 systemd[1]: Starting systemd-fsck-usr.service... Feb 13 19:50:24.792307 systemd[1]: Starting systemd-journald.service - Journal Service... Feb 13 19:50:24.792314 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Feb 13 19:50:24.792321 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Feb 13 19:50:24.792329 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Feb 13 19:50:24.792354 systemd-journald[215]: Collecting audit messages is disabled. Feb 13 19:50:24.792378 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Feb 13 19:50:24.792389 systemd[1]: Finished systemd-fsck-usr.service. Feb 13 19:50:24.792513 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Feb 13 19:50:24.792523 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Feb 13 19:50:24.792531 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Feb 13 19:50:24.792537 kernel: Bridge firewalling registered Feb 13 19:50:24.792549 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Feb 13 19:50:24.792556 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Feb 13 19:50:24.792563 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Feb 13 19:50:24.792570 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... 
Feb 13 19:50:24.792576 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 19:50:24.792585 systemd-journald[215]: Journal started
Feb 13 19:50:24.792604 systemd-journald[215]: Runtime Journal (/run/log/journal/5b60a719a5654806b06355c06edc54b0) is 4.8M, max 38.6M, 33.8M free.
Feb 13 19:50:24.740384 systemd-modules-load[216]: Inserted module 'overlay'
Feb 13 19:50:24.794513 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 19:50:24.759534 systemd-modules-load[216]: Inserted module 'br_netfilter'
Feb 13 19:50:24.793870 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 19:50:24.797540 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 19:50:24.798474 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:24.798925 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 19:50:24.802558 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Feb 13 19:50:24.804032 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 19:50:24.806662 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 19:50:24.813104 dracut-cmdline[249]: dracut-dracut-053
Feb 13 19:50:24.814516 dracut-cmdline[249]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=015d1d9e5e601f6a4e226c935072d3d0819e7eb2da20e68715973498f21aa3fe
Feb 13 19:50:24.828255 systemd-resolved[251]: Positive Trust Anchors:
Feb 13 19:50:24.828267 systemd-resolved[251]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 19:50:24.828290 systemd-resolved[251]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 19:50:24.830104 systemd-resolved[251]: Defaulting to hostname 'linux'.
Feb 13 19:50:24.831042 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 19:50:24.831211 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 19:50:24.865413 kernel: SCSI subsystem initialized
Feb 13 19:50:24.871430 kernel: Loading iSCSI transport class v2.0-870.
Feb 13 19:50:24.878414 kernel: iscsi: registered transport (tcp)
Feb 13 19:50:24.891715 kernel: iscsi: registered transport (qla4xxx)
Feb 13 19:50:24.891758 kernel: QLogic iSCSI HBA Driver
Feb 13 19:50:24.912940 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Feb 13 19:50:24.917505 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Feb 13 19:50:24.934245 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 13 19:50:24.934301 kernel: device-mapper: uevent: version 1.0.3
Feb 13 19:50:24.934312 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
Feb 13 19:50:24.967419 kernel: raid6: avx2x4 gen() 46415 MB/s
Feb 13 19:50:24.982413 kernel: raid6: avx2x2 gen() 53081 MB/s
Feb 13 19:50:24.999626 kernel: raid6: avx2x1 gen() 44507 MB/s
Feb 13 19:50:24.999671 kernel: raid6: using algorithm avx2x2 gen() 53081 MB/s
Feb 13 19:50:25.017659 kernel: raid6: .... xor() 32197 MB/s, rmw enabled
Feb 13 19:50:25.017703 kernel: raid6: using avx2x2 recovery algorithm
Feb 13 19:50:25.031429 kernel: xor: automatically using best checksumming function avx
Feb 13 19:50:25.121420 kernel: Btrfs loaded, zoned=no, fsverity=no
Feb 13 19:50:25.127180 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 19:50:25.133534 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 19:50:25.141467 systemd-udevd[434]: Using default interface naming scheme 'v255'.
Feb 13 19:50:25.144346 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 19:50:25.150539 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Feb 13 19:50:25.157925 dracut-pre-trigger[440]: rd.md=0: removing MD RAID activation
Feb 13 19:50:25.175167 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 19:50:25.179497 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 19:50:25.249964 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 19:50:25.254891 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Feb 13 19:50:25.265546 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Feb 13 19:50:25.265920 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 19:50:25.266215 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 19:50:25.266570 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 19:50:25.270483 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Feb 13 19:50:25.278743 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 19:50:25.315426 kernel: VMware PVSCSI driver - version 1.0.7.0-k
Feb 13 19:50:25.322430 kernel: vmw_pvscsi: using 64bit dma
Feb 13 19:50:25.324509 kernel: vmw_pvscsi: max_id: 16
Feb 13 19:50:25.324529 kernel: vmw_pvscsi: setting ring_pages to 8
Feb 13 19:50:25.332683 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI
Feb 13 19:50:25.332726 kernel: vmw_pvscsi: enabling reqCallThreshold
Feb 13 19:50:25.332741 kernel: vmw_pvscsi: driver-based request coalescing enabled
Feb 13 19:50:25.332753 kernel: vmw_pvscsi: using MSI-X
Feb 13 19:50:25.334568 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254
Feb 13 19:50:25.336132 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2
Feb 13 19:50:25.352005 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0
Feb 13 19:50:25.352090 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6
Feb 13 19:50:25.352184 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps
Feb 13 19:50:25.352260 kernel: libata version 3.00 loaded.
Feb 13 19:50:25.353421 kernel: cryptd: max_cpu_qlen set to 1000
Feb 13 19:50:25.357413 kernel: ata_piix 0000:00:07.1: version 2.13
Feb 13 19:50:25.373663 kernel: scsi host1: ata_piix
Feb 13 19:50:25.373739 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0
Feb 13 19:50:25.373812 kernel: scsi host2: ata_piix
Feb 13 19:50:25.373880 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14
Feb 13 19:50:25.373889 kernel: AVX2 version of gcm_enc/dec engaged.
Feb 13 19:50:25.373897 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15
Feb 13 19:50:25.373904 kernel: AES CTR mode by8 optimization enabled
Feb 13 19:50:25.359434 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 19:50:25.361531 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:25.361701 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:50:25.361790 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 19:50:25.361857 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:25.361957 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:25.369288 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:25.382494 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:25.397652 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Feb 13 19:50:25.409626 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:25.535483 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33
Feb 13 19:50:25.541416 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5
Feb 13 19:50:25.549571 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB)
Feb 13 19:50:25.583752 kernel: sd 0:0:0:0: [sda] Write Protect is off
Feb 13 19:50:25.583856 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00
Feb 13 19:50:25.583940 kernel: sd 0:0:0:0: [sda] Cache data unavailable
Feb 13 19:50:25.584020 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through
Feb 13 19:50:25.584100 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:25.584112 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Feb 13 19:50:25.597961 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray
Feb 13 19:50:25.610365 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 13 19:50:25.610380 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0
Feb 13 19:50:25.636418 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (485)
Feb 13 19:50:25.642414 kernel: BTRFS: device fsid c7adc9b8-df7f-4a5f-93bf-204def2767a9 devid 1 transid 39 /dev/sda3 scanned by (udev-worker) (494)
Feb 13 19:50:25.643111 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT.
Feb 13 19:50:25.646636 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM.
Feb 13 19:50:25.649876 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Feb 13 19:50:25.652928 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A.
Feb 13 19:50:25.653062 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A.
Feb 13 19:50:25.657491 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Feb 13 19:50:25.721424 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:26.731432 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Feb 13 19:50:26.731471 disk-uuid[595]: The operation has completed successfully.
Feb 13 19:50:26.817689 systemd[1]: disk-uuid.service: Deactivated successfully.
Feb 13 19:50:26.818019 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Feb 13 19:50:26.824503 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Feb 13 19:50:26.826494 sh[612]: Success
Feb 13 19:50:26.835417 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2"
Feb 13 19:50:26.913891 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Feb 13 19:50:26.924250 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Feb 13 19:50:26.924644 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Feb 13 19:50:26.940649 kernel: BTRFS info (device dm-0): first mount of filesystem c7adc9b8-df7f-4a5f-93bf-204def2767a9
Feb 13 19:50:26.940682 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:26.940691 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
Feb 13 19:50:26.942749 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Feb 13 19:50:26.942765 kernel: BTRFS info (device dm-0): using free space tree
Feb 13 19:50:26.950417 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Feb 13 19:50:26.951865 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Feb 13 19:50:26.959508 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments...
Feb 13 19:50:26.960701 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Feb 13 19:50:27.025302 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.025351 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:27.025369 kernel: BTRFS info (device sda6): using free space tree
Feb 13 19:50:27.029424 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 19:50:27.039882 systemd[1]: mnt-oem.mount: Deactivated successfully.
Feb 13 19:50:27.042449 kernel: BTRFS info (device sda6): last unmount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.045878 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Feb 13 19:50:27.049565 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Feb 13 19:50:27.073627 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Feb 13 19:50:27.079509 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Feb 13 19:50:27.135851 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 19:50:27.143598 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 19:50:27.144752 ignition[673]: Ignition 2.20.0
Feb 13 19:50:27.144900 ignition[673]: Stage: fetch-offline
Feb 13 19:50:27.144923 ignition[673]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.144928 ignition[673]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.144976 ignition[673]: parsed url from cmdline: ""
Feb 13 19:50:27.144978 ignition[673]: no config URL provided
Feb 13 19:50:27.144981 ignition[673]: reading system config file "/usr/lib/ignition/user.ign"
Feb 13 19:50:27.144985 ignition[673]: no config at "/usr/lib/ignition/user.ign"
Feb 13 19:50:27.145336 ignition[673]: config successfully fetched
Feb 13 19:50:27.145353 ignition[673]: parsing config with SHA512: 47d845dd4d4313284ffd0570bdb128208c674c684235bc5d5814183181f8b6aa8252e2217ff10d57a42511567f1f2015461a33f65b692b3b86ddb4c670b7f41f
Feb 13 19:50:27.148597 unknown[673]: fetched base config from "system"
Feb 13 19:50:27.148724 unknown[673]: fetched user config from "vmware"
Feb 13 19:50:27.149075 ignition[673]: fetch-offline: fetch-offline passed
Feb 13 19:50:27.149226 ignition[673]: Ignition finished successfully
Feb 13 19:50:27.150579 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 19:50:27.157223 systemd-networkd[805]: lo: Link UP
Feb 13 19:50:27.157229 systemd-networkd[805]: lo: Gained carrier
Feb 13 19:50:27.157973 systemd-networkd[805]: Enumeration completed
Feb 13 19:50:27.158231 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 19:50:27.158264 systemd-networkd[805]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network.
Feb 13 19:50:27.158512 systemd[1]: Reached target network.target - Network.
Feb 13 19:50:27.162061 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Feb 13 19:50:27.162222 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Feb 13 19:50:27.158856 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json).
Feb 13 19:50:27.161715 systemd-networkd[805]: ens192: Link UP
Feb 13 19:50:27.161718 systemd-networkd[805]: ens192: Gained carrier
Feb 13 19:50:27.164511 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Feb 13 19:50:27.173701 ignition[808]: Ignition 2.20.0
Feb 13 19:50:27.173709 ignition[808]: Stage: kargs
Feb 13 19:50:27.173819 ignition[808]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.173825 ignition[808]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.174447 ignition[808]: kargs: kargs passed
Feb 13 19:50:27.174476 ignition[808]: Ignition finished successfully
Feb 13 19:50:27.175832 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Feb 13 19:50:27.180574 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Feb 13 19:50:27.187978 ignition[815]: Ignition 2.20.0
Feb 13 19:50:27.187985 ignition[815]: Stage: disks
Feb 13 19:50:27.188513 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.188523 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.189471 ignition[815]: disks: disks passed
Feb 13 19:50:27.189510 ignition[815]: Ignition finished successfully
Feb 13 19:50:27.190033 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Feb 13 19:50:27.190438 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Feb 13 19:50:27.190577 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Feb 13 19:50:27.190783 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 19:50:27.190977 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 19:50:27.191151 systemd[1]: Reached target basic.target - Basic System.
Feb 13 19:50:27.195483 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Feb 13 19:50:27.206150 systemd-fsck[823]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
Feb 13 19:50:27.207673 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Feb 13 19:50:27.212614 systemd[1]: Mounting sysroot.mount - /sysroot...
Feb 13 19:50:27.316995 systemd[1]: Mounted sysroot.mount - /sysroot.
Feb 13 19:50:27.317423 kernel: EXT4-fs (sda9): mounted filesystem 7d46b70d-4c30-46e6-9935-e1f7fb523560 r/w with ordered data mode. Quota mode: none.
Feb 13 19:50:27.317361 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Feb 13 19:50:27.321459 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 19:50:27.324357 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Feb 13 19:50:27.324752 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Feb 13 19:50:27.324787 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Feb 13 19:50:27.324807 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 19:50:27.329119 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Feb 13 19:50:27.330089 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Feb 13 19:50:27.378394 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (831)
Feb 13 19:50:27.378456 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.378477 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:27.379470 kernel: BTRFS info (device sda6): using free space tree
Feb 13 19:50:27.384751 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 19:50:27.385320 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 19:50:27.397925 initrd-setup-root[855]: cut: /sysroot/etc/passwd: No such file or directory
Feb 13 19:50:27.400825 initrd-setup-root[862]: cut: /sysroot/etc/group: No such file or directory
Feb 13 19:50:27.403136 initrd-setup-root[869]: cut: /sysroot/etc/shadow: No such file or directory
Feb 13 19:50:27.405660 initrd-setup-root[876]: cut: /sysroot/etc/gshadow: No such file or directory
Feb 13 19:50:27.480352 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Feb 13 19:50:27.486487 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Feb 13 19:50:27.488942 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Feb 13 19:50:27.493421 kernel: BTRFS info (device sda6): last unmount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.507141 ignition[944]: INFO : Ignition 2.20.0
Feb 13 19:50:27.508191 ignition[944]: INFO : Stage: mount
Feb 13 19:50:27.508191 ignition[944]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:27.508191 ignition[944]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:27.508191 ignition[944]: INFO : mount: mount passed
Feb 13 19:50:27.508191 ignition[944]: INFO : Ignition finished successfully
Feb 13 19:50:27.509762 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Feb 13 19:50:27.510038 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Feb 13 19:50:27.514509 systemd[1]: Starting ignition-files.service - Ignition (files)...
Feb 13 19:50:27.938824 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Feb 13 19:50:27.943502 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Feb 13 19:50:27.983419 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (956)
Feb 13 19:50:27.986370 kernel: BTRFS info (device sda6): first mount of filesystem 60a376b4-1193-4e0b-af89-a0e6d698bf0f
Feb 13 19:50:27.986417 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Feb 13 19:50:27.986432 kernel: BTRFS info (device sda6): using free space tree
Feb 13 19:50:27.991419 kernel: BTRFS info (device sda6): enabling ssd optimizations
Feb 13 19:50:27.992249 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Feb 13 19:50:28.011419 ignition[973]: INFO : Ignition 2.20.0
Feb 13 19:50:28.011419 ignition[973]: INFO : Stage: files
Feb 13 19:50:28.011419 ignition[973]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:28.011419 ignition[973]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:28.012269 ignition[973]: DEBUG : files: compiled without relabeling support, skipping
Feb 13 19:50:28.012432 ignition[973]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Feb 13 19:50:28.012432 ignition[973]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Feb 13 19:50:28.014549 ignition[973]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Feb 13 19:50:28.014692 ignition[973]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Feb 13 19:50:28.014845 unknown[973]: wrote ssh authorized keys file for user: core
Feb 13 19:50:28.015052 ignition[973]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Feb 13 19:50:28.017213 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 19:50:28.017480 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Feb 13 19:50:28.074436 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Feb 13 19:50:28.191437 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Feb 13 19:50:28.191709 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 19:50:28.192949 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1
Feb 13 19:50:28.675049 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Feb 13 19:50:28.934250 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw"
Feb 13 19:50:28.934250 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): [started] processing unit "prepare-helm.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): op(d): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): op(d): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(c): [finished] processing unit "prepare-helm.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): [started] processing unit "coreos-metadata.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): op(f): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): op(f): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(e): [finished] processing unit "coreos-metadata.service"
Feb 13 19:50:28.934660 ignition[973]: INFO : files: op(10): [started] setting preset to disabled for "coreos-metadata.service"
Feb 13 19:50:29.011534 systemd-networkd[805]: ens192: Gained IPv6LL
Feb 13 19:50:29.012835 ignition[973]: INFO : files: op(10): op(11): [started] removing enablement symlink(s) for "coreos-metadata.service"
Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(10): op(11): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(10): [finished] setting preset to disabled for "coreos-metadata.service"
Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service"
Feb 13 19:50:29.015295 ignition[973]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service"
Feb 13 19:50:29.016606 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 19:50:29.016606 ignition[973]: INFO : files: createResultFile: createFiles: op(13): [finished] writing file "/sysroot/etc/.ignition-result.json"
Feb 13 19:50:29.016606 ignition[973]: INFO : files: files passed
Feb 13 19:50:29.016606 ignition[973]: INFO : Ignition finished successfully
Feb 13 19:50:29.016170 systemd[1]: Finished ignition-files.service - Ignition (files).
Feb 13 19:50:29.023525 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Feb 13 19:50:29.025033 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Feb 13 19:50:29.032184 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 19:50:29.032184 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 19:50:29.032997 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Feb 13 19:50:29.033889 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 19:50:29.034229 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Feb 13 19:50:29.037491 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Feb 13 19:50:29.037699 systemd[1]: ignition-quench.service: Deactivated successfully.
Feb 13 19:50:29.037744 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Feb 13 19:50:29.049214 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 13 19:50:29.049264 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Feb 13 19:50:29.049813 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Feb 13 19:50:29.049913 systemd[1]: Reached target initrd.target - Initrd Default Target.
Feb 13 19:50:29.050025 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Feb 13 19:50:29.050962 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Feb 13 19:50:29.059246 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 19:50:29.063520 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Feb 13 19:50:29.069005 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Feb 13 19:50:29.069263 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 19:50:29.069457 systemd[1]: Stopped target timers.target - Timer Units.
Feb 13 19:50:29.069595 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 13 19:50:29.069663 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Feb 13 19:50:29.069869 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Feb 13 19:50:29.070084 systemd[1]: Stopped target basic.target - Basic System.
Feb 13 19:50:29.070255 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Feb 13 19:50:29.070478 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Feb 13 19:50:29.070721 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Feb 13 19:50:29.070910 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Feb 13 19:50:29.071094 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Feb 13 19:50:29.071307 systemd[1]: Stopped target sysinit.target - System Initialization.
Feb 13 19:50:29.071705 systemd[1]: Stopped target local-fs.target - Local File Systems.
Feb 13 19:50:29.071911 systemd[1]: Stopped target swap.target - Swaps.
Feb 13 19:50:29.072088 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 13 19:50:29.072149 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Feb 13 19:50:29.072464 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Feb 13 19:50:29.072617 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 19:50:29.072797 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Feb 13 19:50:29.072846 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 19:50:29.072983 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 13 19:50:29.073044 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Feb 13 19:50:29.073298 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Feb 13 19:50:29.073371 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Feb 13 19:50:29.073637 systemd[1]: Stopped target paths.target - Path Units.
Feb 13 19:50:29.073787 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 13 19:50:29.078428 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 19:50:29.078612 systemd[1]: Stopped target slices.target - Slice Units.
Feb 13 19:50:29.078840 systemd[1]: Stopped target sockets.target - Socket Units.
Feb 13 19:50:29.078992 systemd[1]: iscsid.socket: Deactivated successfully.
Feb 13 19:50:29.079069 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Feb 13 19:50:29.079317 systemd[1]: iscsiuio.socket: Deactivated successfully.
Feb 13 19:50:29.079365 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Feb 13 19:50:29.079614 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Feb 13 19:50:29.079675 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Feb 13 19:50:29.079937 systemd[1]: ignition-files.service: Deactivated successfully.
Feb 13 19:50:29.079998 systemd[1]: Stopped ignition-files.service - Ignition (files).
Feb 13 19:50:29.085546 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Feb 13 19:50:29.085654 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 13 19:50:29.085751 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 19:50:29.087533 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Feb 13 19:50:29.087649 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 13 19:50:29.087743 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 19:50:29.087962 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 13 19:50:29.088033 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Feb 13 19:50:29.090573 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 13 19:50:29.090624 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Feb 13 19:50:29.097154 ignition[1027]: INFO : Ignition 2.20.0
Feb 13 19:50:29.097661 ignition[1027]: INFO : Stage: umount
Feb 13 19:50:29.097990 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Feb 13 19:50:29.098909 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Feb 13 19:50:29.098909 ignition[1027]: INFO : umount: umount passed
Feb 13 19:50:29.098909 ignition[1027]: INFO : Ignition finished successfully
Feb 13 19:50:29.100088 systemd[1]: ignition-mount.service: Deactivated successfully.
Feb 13 19:50:29.100832 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Feb 13 19:50:29.101101 systemd[1]: Stopped target network.target - Network.
Feb 13 19:50:29.101230 systemd[1]: ignition-disks.service: Deactivated successfully.
Feb 13 19:50:29.101268 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Feb 13 19:50:29.101450 systemd[1]: ignition-kargs.service: Deactivated successfully.
Feb 13 19:50:29.101476 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Feb 13 19:50:29.101650 systemd[1]: ignition-setup.service: Deactivated successfully.
Feb 13 19:50:29.101676 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Feb 13 19:50:29.101815 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Feb 13 19:50:29.101846 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Feb 13 19:50:29.102278 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Feb 13 19:50:29.102656 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Feb 13 19:50:29.104307 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Feb 13 19:50:29.105279 systemd[1]: systemd-resolved.service: Deactivated successfully.
Feb 13 19:50:29.105578 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Feb 13 19:50:29.107011 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 13 19:50:29.107398 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 19:50:29.108131 systemd[1]: systemd-networkd.service: Deactivated successfully.
Feb 13 19:50:29.108357 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Feb 13 19:50:29.108984 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Feb 13 19:50:29.109127 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 19:50:29.113471 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Feb 13 19:50:29.113580 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Feb 13 19:50:29.113614 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Feb 13 19:50:29.113758 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Feb 13 19:50:29.113784 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Feb 13 19:50:29.113922 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 13 19:50:29.113947 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Feb 13 19:50:29.114070 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 13 19:50:29.114094 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Feb 13 19:50:29.114258 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 19:50:29.123625 systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 13 19:50:29.123728 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 19:50:29.124193 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 13 19:50:29.124233 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Feb 13 19:50:29.124510 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 13 19:50:29.124531 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 19:50:29.124726 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 13 19:50:29.124754 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Feb 13 19:50:29.124909 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 13 19:50:29.124935 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Feb 13 19:50:29.125070 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Feb 13 19:50:29.125095 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Feb 13 19:50:29.127560 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Feb 13 19:50:29.128464 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 13 19:50:29.128495 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 19:50:29.128629 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 13 19:50:29.128653 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:29.128951 systemd[1]: network-cleanup.service: Deactivated successfully.
Feb 13 19:50:29.129002 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Feb 13 19:50:29.133235 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 13 19:50:29.133444 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Feb 13 19:50:29.210374 systemd[1]: sysroot-boot.service: Deactivated successfully.
Feb 13 19:50:29.210471 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Feb 13 19:50:29.210989 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Feb 13 19:50:29.211141 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Feb 13 19:50:29.211183 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Feb 13 19:50:29.215499 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Feb 13 19:50:29.227143 systemd[1]: Switching root.
Feb 13 19:50:29.265607 systemd-journald[215]: Journal stopped
Feb 13 19:50:30.743124 systemd-journald[215]: Received SIGTERM from PID 1 (systemd).
Feb 13 19:50:30.743153 kernel: SELinux: policy capability network_peer_controls=1
Feb 13 19:50:30.743166 kernel: SELinux: policy capability open_perms=1
Feb 13 19:50:30.743177 kernel: SELinux: policy capability extended_socket_class=1
Feb 13 19:50:30.743187 kernel: SELinux: policy capability always_check_network=0
Feb 13 19:50:30.743197 kernel: SELinux: policy capability cgroup_seclabel=1
Feb 13 19:50:30.743211 kernel: SELinux: policy capability nnp_nosuid_transition=1
Feb 13 19:50:30.743221 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Feb 13 19:50:30.743232 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Feb 13 19:50:30.743242 systemd[1]: Successfully loaded SELinux policy in 53.481ms.
Feb 13 19:50:30.743254 kernel: audit: type=1403 audit(1739476229.961:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 13 19:50:30.743266 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 8.787ms.
Feb 13 19:50:30.743279 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Feb 13 19:50:30.743293 systemd[1]: Detected virtualization vmware.
Feb 13 19:50:30.743306 systemd[1]: Detected architecture x86-64.
Feb 13 19:50:30.743317 systemd[1]: Detected first boot.
Feb 13 19:50:30.743329 systemd[1]: Initializing machine ID from random generator.
Feb 13 19:50:30.743342 zram_generator::config[1069]: No configuration found.
Feb 13 19:50:30.743355 systemd[1]: Populated /etc with preset unit settings.
Feb 13 19:50:30.743368 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 19:50:30.743381 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Feb 13 19:50:30.743393 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 13 19:50:30.743413 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Feb 13 19:50:30.743436 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 13 19:50:30.743450 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Feb 13 19:50:30.743464 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Feb 13 19:50:30.743475 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Feb 13 19:50:30.743487 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Feb 13 19:50:30.743498 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Feb 13 19:50:30.743509 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Feb 13 19:50:30.743521 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Feb 13 19:50:30.743531 systemd[1]: Created slice user.slice - User and Session Slice.
Feb 13 19:50:30.743544 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Feb 13 19:50:30.743557 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Feb 13 19:50:30.743569 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Feb 13 19:50:30.743581 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Feb 13 19:50:30.743593 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Feb 13 19:50:30.743605 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Feb 13 19:50:30.743617 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Feb 13 19:50:30.743630 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Feb 13 19:50:30.743645 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Feb 13 19:50:30.743659 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Feb 13 19:50:30.743672 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Feb 13 19:50:30.743684 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Feb 13 19:50:30.743696 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Feb 13 19:50:30.743709 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Feb 13 19:50:30.743721 systemd[1]: Reached target slices.target - Slice Units.
Feb 13 19:50:30.743735 systemd[1]: Reached target swap.target - Swaps.
Feb 13 19:50:30.743748 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Feb 13 19:50:30.743760 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Feb 13 19:50:30.743772 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Feb 13 19:50:30.743785 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Feb 13 19:50:30.743801 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Feb 13 19:50:30.743813 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Feb 13 19:50:30.743826 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Feb 13 19:50:30.743839 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Feb 13 19:50:30.743851 systemd[1]: Mounting media.mount - External Media Directory...
Feb 13 19:50:30.743864 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 19:50:30.743877 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Feb 13 19:50:30.743890 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Feb 13 19:50:30.743906 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Feb 13 19:50:30.743920 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 13 19:50:30.743933 systemd[1]: Reached target machines.target - Containers.
Feb 13 19:50:30.743945 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Feb 13 19:50:30.743958 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Feb 13 19:50:30.743971 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Feb 13 19:50:30.743984 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Feb 13 19:50:30.743996 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 19:50:30.744007 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 19:50:30.744023 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 19:50:30.744036 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Feb 13 19:50:30.744047 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 19:50:30.744058 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Feb 13 19:50:30.744069 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 13 19:50:30.744079 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Feb 13 19:50:30.744090 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Feb 13 19:50:30.744101 systemd[1]: Stopped systemd-fsck-usr.service.
Feb 13 19:50:30.744114 systemd[1]: Starting systemd-journald.service - Journal Service...
Feb 13 19:50:30.744126 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Feb 13 19:50:30.744137 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Feb 13 19:50:30.744149 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Feb 13 19:50:30.744161 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Feb 13 19:50:30.744172 systemd[1]: verity-setup.service: Deactivated successfully.
Feb 13 19:50:30.744183 systemd[1]: Stopped verity-setup.service.
Feb 13 19:50:30.744195 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 19:50:30.744209 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Feb 13 19:50:30.744221 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Feb 13 19:50:30.744232 systemd[1]: Mounted media.mount - External Media Directory.
Feb 13 19:50:30.744243 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Feb 13 19:50:30.744254 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Feb 13 19:50:30.744266 kernel: fuse: init (API version 7.39)
Feb 13 19:50:30.744276 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Feb 13 19:50:30.744288 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Feb 13 19:50:30.744300 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 13 19:50:30.744313 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Feb 13 19:50:30.744324 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 19:50:30.744336 kernel: loop: module loaded
Feb 13 19:50:30.744346 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 19:50:30.744358 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 19:50:30.744370 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 19:50:30.744395 systemd-journald[1156]: Collecting audit messages is disabled.
Feb 13 19:50:30.745448 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 13 19:50:30.745463 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Feb 13 19:50:30.745474 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 19:50:30.745485 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 19:50:30.745497 systemd-journald[1156]: Journal started
Feb 13 19:50:30.745521 systemd-journald[1156]: Runtime Journal (/run/log/journal/9bdcc78876044d95b2434209d8bb4868) is 4.8M, max 38.6M, 33.8M free.
Feb 13 19:50:30.526292 systemd[1]: Queued start job for default target multi-user.target.
Feb 13 19:50:30.543015 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Feb 13 19:50:30.543227 systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 13 19:50:30.746190 jq[1136]: true
Feb 13 19:50:30.747425 systemd[1]: Started systemd-journald.service - Journal Service.
Feb 13 19:50:30.747959 jq[1168]: true
Feb 13 19:50:30.748054 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Feb 13 19:50:30.748969 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Feb 13 19:50:30.749279 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Feb 13 19:50:30.767327 systemd[1]: Reached target network-pre.target - Preparation for Network.
Feb 13 19:50:30.775526 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Feb 13 19:50:30.781223 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Feb 13 19:50:30.781376 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Feb 13 19:50:30.781447 systemd[1]: Reached target local-fs.target - Local File Systems.
Feb 13 19:50:30.782526 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Feb 13 19:50:30.790493 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Feb 13 19:50:30.791518 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Feb 13 19:50:30.791668 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 19:50:30.794418 kernel: ACPI: bus type drm_connector registered
Feb 13 19:50:30.806499 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Feb 13 19:50:30.815487 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Feb 13 19:50:30.815806 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 19:50:30.818661 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Feb 13 19:50:30.818784 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 19:50:30.820475 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Feb 13 19:50:30.823475 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Feb 13 19:50:30.824739 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Feb 13 19:50:30.825006 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 19:50:30.825092 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 19:50:30.825279 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Feb 13 19:50:30.825425 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Feb 13 19:50:30.826733 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Feb 13 19:50:30.839511 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Feb 13 19:50:30.849807 systemd-journald[1156]: Time spent on flushing to /var/log/journal/9bdcc78876044d95b2434209d8bb4868 is 23.664ms for 1832 entries.
Feb 13 19:50:30.849807 systemd-journald[1156]: System Journal (/var/log/journal/9bdcc78876044d95b2434209d8bb4868) is 8.0M, max 584.8M, 576.8M free.
Feb 13 19:50:30.886327 systemd-journald[1156]: Received client request to flush runtime journal.
Feb 13 19:50:30.886357 kernel: loop0: detected capacity change from 0 to 138184
Feb 13 19:50:30.863653 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Feb 13 19:50:30.863950 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Feb 13 19:50:30.873499 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Feb 13 19:50:30.887326 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Feb 13 19:50:30.895426 ignition[1172]: Ignition 2.20.0
Feb 13 19:50:30.895610 ignition[1172]: deleting config from guestinfo properties
Feb 13 19:50:30.923972 ignition[1172]: Successfully deleted config
Feb 13 19:50:30.926344 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Feb 13 19:50:30.928250 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Feb 13 19:50:30.929152 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Feb 13 19:50:30.930617 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Feb 13 19:50:30.951136 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Feb 13 19:50:30.957077 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Feb 13 19:50:30.965417 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Feb 13 19:50:30.965429 udevadm[1229]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in.
Feb 13 19:50:31.005419 kernel: loop1: detected capacity change from 0 to 2960
Feb 13 19:50:31.015990 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Feb 13 19:50:31.023505 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Feb 13 19:50:31.040417 kernel: loop2: detected capacity change from 0 to 210664
Feb 13 19:50:31.052337 systemd-tmpfiles[1231]: ACLs are not supported, ignoring.
Feb 13 19:50:31.052350 systemd-tmpfiles[1231]: ACLs are not supported, ignoring.
Feb 13 19:50:31.058604 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Feb 13 19:50:31.084419 kernel: loop3: detected capacity change from 0 to 141000
Feb 13 19:50:31.158513 kernel: loop4: detected capacity change from 0 to 138184
Feb 13 19:50:31.193423 kernel: loop5: detected capacity change from 0 to 2960
Feb 13 19:50:31.206483 kernel: loop6: detected capacity change from 0 to 210664
Feb 13 19:50:31.228553 kernel: loop7: detected capacity change from 0 to 141000
Feb 13 19:50:31.259755 (sd-merge)[1237]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Feb 13 19:50:31.260069 (sd-merge)[1237]: Merged extensions into '/usr'.
Feb 13 19:50:31.275369 systemd[1]: Reloading requested from client PID 1205 ('systemd-sysext') (unit systemd-sysext.service)...
Feb 13 19:50:31.275460 systemd[1]: Reloading...
Feb 13 19:50:31.334878 zram_generator::config[1263]: No configuration found.
Feb 13 19:50:31.427105 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 19:50:31.444538 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 19:50:31.476379 systemd[1]: Reloading finished in 200 ms.
Feb 13 19:50:31.498891 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Feb 13 19:50:31.500758 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Feb 13 19:50:31.510589 systemd[1]: Starting ensure-sysext.service...
Feb 13 19:50:31.513499 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Feb 13 19:50:31.516991 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Feb 13 19:50:31.518918 systemd[1]: Reloading requested from client PID 1319 ('systemctl') (unit ensure-sysext.service)...
Feb 13 19:50:31.518926 systemd[1]: Reloading...
Feb 13 19:50:31.536465 ldconfig[1197]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Feb 13 19:50:31.537522 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Feb 13 19:50:31.537857 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Feb 13 19:50:31.538463 systemd-tmpfiles[1320]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Feb 13 19:50:31.538685 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Feb 13 19:50:31.538766 systemd-tmpfiles[1320]: ACLs are not supported, ignoring.
Feb 13 19:50:31.541239 systemd-tmpfiles[1320]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 19:50:31.541245 systemd-tmpfiles[1320]: Skipping /boot
Feb 13 19:50:31.550670 systemd-udevd[1321]: Using default interface naming scheme 'v255'.
Feb 13 19:50:31.554106 systemd-tmpfiles[1320]: Detected autofs mount point /boot during canonicalization of boot.
Feb 13 19:50:31.554113 systemd-tmpfiles[1320]: Skipping /boot
Feb 13 19:50:31.592426 zram_generator::config[1364]: No configuration found.
Feb 13 19:50:31.670445 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1351)
Feb 13 19:50:31.676460 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Feb 13 19:50:31.685418 kernel: ACPI: button: Power Button [PWRF]
Feb 13 19:50:31.705183 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 19:50:31.725228 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 19:50:31.769429 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Feb 13 19:50:31.769896 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Feb 13 19:50:31.770123 systemd[1]: Reloading finished in 250 ms.
Feb 13 19:50:31.780149 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Feb 13 19:50:31.782195 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Feb 13 19:50:31.782343 kernel: Guest personality initialized and is active
Feb 13 19:50:31.782006 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Feb 13 19:50:31.782684 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Feb 13 19:50:31.784433 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Feb 13 19:50:31.784471 kernel: Initialized host personality
Feb 13 19:50:31.785712 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Feb 13 19:50:31.807972 systemd[1]: Finished ensure-sysext.service.
Feb 13 19:50:31.816455 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3
Feb 13 19:50:31.816860 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 19:50:31.820521 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Feb 13 19:50:31.822927 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Feb 13 19:50:31.824496 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Feb 13 19:50:31.826014 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Feb 13 19:50:31.827492 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Feb 13 19:50:31.828534 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Feb 13 19:50:31.829549 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 13 19:50:31.831502 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Feb 13 19:50:31.833784 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Feb 13 19:50:31.841580 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Feb 13 19:50:31.843483 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Feb 13 19:50:31.847504 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Feb 13 19:50:31.850834 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Feb 13 19:50:31.850977 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Feb 13 19:50:31.859520 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Feb 13 19:50:31.859765 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Feb 13 19:50:31.859870 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Feb 13 19:50:31.862747 systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 13 19:50:31.862856 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Feb 13 19:50:31.869154 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Feb 13 19:50:31.874188 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Feb 13 19:50:31.884316 systemd[1]: modprobe@loop.service: Deactivated successfully.
Feb 13 19:50:31.884450 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Feb 13 19:50:31.884716 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 13 19:50:31.889514 kernel: mousedev: PS/2 mouse device common for all mice
Feb 13 19:50:31.890724 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Feb 13 19:50:31.890849 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Feb 13 19:50:31.891081 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Feb 13 19:50:31.900529 (udev-worker)[1363]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Feb 13 19:50:31.901639 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Feb 13 19:50:31.911198 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Feb 13 19:50:31.916776 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Feb 13 19:50:31.918903 augenrules[1481]: No rules
Feb 13 19:50:31.924100 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Feb 13 19:50:31.924384 systemd[1]: audit-rules.service: Deactivated successfully.
Feb 13 19:50:31.924778 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Feb 13 19:50:31.926799 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Feb 13 19:50:31.928904 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Feb 13 19:50:31.937459 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Feb 13 19:50:31.947377 lvm[1489]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 19:50:31.975047 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Feb 13 19:50:31.975284 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Feb 13 19:50:31.985023 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Feb 13 19:50:31.996481 lvm[1499]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Feb 13 19:50:32.001481 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Feb 13 19:50:32.001785 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Feb 13 19:50:32.009371 systemd-networkd[1445]: lo: Link UP
Feb 13 19:50:32.009377 systemd-networkd[1445]: lo: Gained carrier
Feb 13 19:50:32.013141 systemd-networkd[1445]: Enumeration completed
Feb 13 19:50:32.013205 systemd[1]: Started systemd-networkd.service - Network Configuration.
Feb 13 19:50:32.014666 systemd-networkd[1445]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Feb 13 19:50:32.014738 systemd-resolved[1446]: Positive Trust Anchors:
Feb 13 19:50:32.015772 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Feb 13 19:50:32.015898 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Feb 13 19:50:32.014927 systemd-resolved[1446]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Feb 13 19:50:32.014954 systemd-resolved[1446]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Feb 13 19:50:32.017782 systemd-networkd[1445]: ens192: Link UP
Feb 13 19:50:32.017872 systemd-networkd[1445]: ens192: Gained carrier
Feb 13 19:50:32.019869 systemd-resolved[1446]: Defaulting to hostname 'linux'.
Feb 13 19:50:32.026569 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Feb 13 19:50:32.026766 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Feb 13 19:50:32.027046 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Feb 13 19:50:32.027293 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Feb 13 19:50:32.028004 systemd[1]: Reached target network.target - Network.
Feb 13 19:50:32.028573 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Feb 13 19:50:32.030653 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Feb 13 19:50:32.031514 systemd[1]: Reached target sysinit.target - System Initialization.
Feb 13 19:50:32.031669 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Feb 13 19:50:32.031810 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Feb 13 19:50:32.031953 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Feb 13 19:50:32.032065 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Feb 13 19:50:32.032086 systemd[1]: Reached target paths.target - Path Units.
Feb 13 19:50:32.032172 systemd[1]: Reached target time-set.target - System Time Set.
Feb 13 19:50:32.032355 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Feb 13 19:50:32.032551 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Feb 13 19:50:32.032659 systemd[1]: Reached target timers.target - Timer Units.
Feb 13 19:50:32.033586 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Feb 13 19:50:32.034662 systemd[1]: Starting docker.socket - Docker Socket for the API...
Feb 13 19:50:32.044537 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Feb 13 19:50:32.045025 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Feb 13 19:50:32.045173 systemd[1]: Reached target sockets.target - Socket Units.
Feb 13 19:50:32.045265 systemd[1]: Reached target basic.target - Basic System.
Feb 13 19:50:32.045382 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Feb 13 19:50:32.045415 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Feb 13 19:50:32.046165 systemd[1]: Starting containerd.service - containerd container runtime...
Feb 13 19:50:32.047490 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Feb 13 19:50:32.050007 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Feb 13 19:50:32.052545 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Feb 13 19:50:32.052657 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Feb 13 19:50:32.053390 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Feb 13 19:50:32.056470 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Feb 13 19:50:32.057011 jq[1511]: false
Feb 13 19:50:32.057488 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Feb 13 19:50:32.058350 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Feb 13 19:50:32.061394 systemd[1]: Starting systemd-logind.service - User Login Management...
Feb 13 19:50:32.062651 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Feb 13 19:50:32.063085 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Feb 13 19:50:32.063698 systemd[1]: Starting update-engine.service - Update Engine...
Feb 13 19:50:32.066315 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Feb 13 19:50:32.068518 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Feb 13 19:50:32.069609 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Feb 13 19:50:32.069729 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Feb 13 19:50:32.080253 jq[1520]: true
Feb 13 19:50:32.088817 systemd[1]: motdgen.service: Deactivated successfully.
Feb 13 19:50:32.089449 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Feb 13 19:50:32.089753 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Feb 13 19:50:32.089978 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Feb 13 19:50:32.100502 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Feb 13 19:50:32.103528 update_engine[1519]: I20250213 19:50:32.103484 1519 main.cc:92] Flatcar Update Engine starting
Feb 13 19:50:32.109483 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Feb 13 19:50:32.111465 extend-filesystems[1512]: Found loop4
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found loop5
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found loop6
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found loop7
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda1
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda2
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda3
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found usr
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda4
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda6
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda7
Feb 13 19:50:32.112054 extend-filesystems[1512]: Found sda9
Feb 13 19:50:32.112054 extend-filesystems[1512]: Checking size of /dev/sda9
Feb 13 19:50:32.116241 (ntainerd)[1536]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Feb 13 19:50:32.127651 dbus-daemon[1510]: [system] SELinux support is enabled
Feb 13 19:50:32.127737 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Feb 13 19:50:32.134413 tar[1534]: linux-amd64/helm
Feb 13 19:50:32.130152 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Feb 13 19:50:32.130169 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Feb 13 19:50:32.130309 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Feb 13 19:50:32.130319 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Feb 13 19:50:32.137969 jq[1535]: true
Feb 13 19:50:32.139619 update_engine[1519]: I20250213 19:50:32.139518 1519 update_check_scheduler.cc:74] Next update check in 10m58s
Feb 13 19:50:32.141215 extend-filesystems[1512]: Old size kept for /dev/sda9
Feb 13 19:50:32.141461 extend-filesystems[1512]: Found sr0
Feb 13 19:50:32.144756 systemd[1]: extend-filesystems.service: Deactivated successfully.
Feb 13 19:50:32.145328 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Feb 13 19:50:32.149490 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Feb 13 19:50:32.151662 systemd[1]: Started update-engine.service - Update Engine.
Feb 13 19:50:32.156584 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Feb 13 19:50:32.181600 unknown[1543]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Feb 13 19:50:32.184510 unknown[1543]: Core dump limit set to -1
Feb 13 19:50:32.199415 kernel: NET: Registered PF_VSOCK protocol family
Feb 13 19:50:32.205157 systemd-logind[1517]: Watching system buttons on /dev/input/event1 (Power Button)
Feb 13 19:50:32.205173 systemd-logind[1517]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Feb 13 19:50:32.205474 systemd-logind[1517]: New seat seat0.
Feb 13 19:50:32.205843 systemd[1]: Started systemd-logind.service - User Login Management.
Feb 13 19:50:32.214417 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (1359)
Feb 13 19:50:32.291079 bash[1574]: Updated "/home/core/.ssh/authorized_keys"
Feb 13 19:50:32.292431 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Feb 13 19:50:32.294635 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Feb 13 19:50:32.332417 locksmithd[1556]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Feb 13 19:51:46.396427 systemd-timesyncd[1447]: Contacted time server 216.164.157.216:123 (0.flatcar.pool.ntp.org).
Feb 13 19:51:46.396464 systemd-timesyncd[1447]: Initial clock synchronization to Thu 2025-02-13 19:51:46.396340 UTC.
Feb 13 19:51:46.396501 systemd-resolved[1446]: Clock change detected. Flushing caches.
Feb 13 19:51:46.420442 sshd_keygen[1537]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Feb 13 19:51:46.463307 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Feb 13 19:51:46.470783 systemd[1]: Starting issuegen.service - Generate /run/issue...
Feb 13 19:51:46.478699 systemd[1]: issuegen.service: Deactivated successfully.
Feb 13 19:51:46.479504 systemd[1]: Finished issuegen.service - Generate /run/issue.
Feb 13 19:51:46.484283 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Feb 13 19:51:46.486608 containerd[1536]: time="2025-02-13T19:51:46.486566101Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Feb 13 19:51:46.500358 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Feb 13 19:51:46.509468 systemd[1]: Started getty@tty1.service - Getty on tty1.
Feb 13 19:51:46.512388 containerd[1536]: time="2025-02-13T19:51:46.512360445Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.513579 containerd[1536]: time="2025-02-13T19:51:46.513560603Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.74-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Feb 13 19:51:46.513625 containerd[1536]: time="2025-02-13T19:51:46.513617109Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Feb 13 19:51:46.513659 containerd[1536]: time="2025-02-13T19:51:46.513652454Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Feb 13 19:51:46.513775 containerd[1536]: time="2025-02-13T19:51:46.513766421Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Feb 13 19:51:46.513823 containerd[1536]: time="2025-02-13T19:51:46.513815023Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.513892 containerd[1536]: time="2025-02-13T19:51:46.513882078Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 19:51:46.513922 containerd[1536]: time="2025-02-13T19:51:46.513916300Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514054 containerd[1536]: time="2025-02-13T19:51:46.514044275Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514122 containerd[1536]: time="2025-02-13T19:51:46.514113658Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514164 containerd[1536]: time="2025-02-13T19:51:46.514155577Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514196 containerd[1536]: time="2025-02-13T19:51:46.514189789Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514265 containerd[1536]: time="2025-02-13T19:51:46.514256286Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514448 containerd[1536]: time="2025-02-13T19:51:46.514439543Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514561 containerd[1536]: time="2025-02-13T19:51:46.514550776Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Feb 13 19:51:46.514593 containerd[1536]: time="2025-02-13T19:51:46.514587157Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Feb 13 19:51:46.515682 containerd[1536]: time="2025-02-13T19:51:46.515670538Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Feb 13 19:51:46.515759 containerd[1536]: time="2025-02-13T19:51:46.515749760Z" level=info msg="metadata content store policy set" policy=shared
Feb 13 19:51:46.519211 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Feb 13 19:51:46.519435 systemd[1]: Reached target getty.target - Login Prompts.
Feb 13 19:51:46.523972 containerd[1536]: time="2025-02-13T19:51:46.523950438Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Feb 13 19:51:46.524127 containerd[1536]: time="2025-02-13T19:51:46.524116765Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Feb 13 19:51:46.524184 containerd[1536]: time="2025-02-13T19:51:46.524175725Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Feb 13 19:51:46.524229 containerd[1536]: time="2025-02-13T19:51:46.524221909Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Feb 13 19:51:46.524268 containerd[1536]: time="2025-02-13T19:51:46.524256628Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Feb 13 19:51:46.524483 containerd[1536]: time="2025-02-13T19:51:46.524470760Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Feb 13 19:51:46.524769 containerd[1536]: time="2025-02-13T19:51:46.524757815Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Feb 13 19:51:46.524874 containerd[1536]: time="2025-02-13T19:51:46.524864726Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Feb 13 19:51:46.524932 containerd[1536]: time="2025-02-13T19:51:46.524923528Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Feb 13 19:51:46.524993 containerd[1536]: time="2025-02-13T19:51:46.524974337Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Feb 13 19:51:46.525032 containerd[1536]: time="2025-02-13T19:51:46.525024708Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525068 containerd[1536]: time="2025-02-13T19:51:46.525061290Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525102 containerd[1536]: time="2025-02-13T19:51:46.525095580Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525145 containerd[1536]: time="2025-02-13T19:51:46.525136272Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525191 containerd[1536]: time="2025-02-13T19:51:46.525182640Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525235 containerd[1536]: time="2025-02-13T19:51:46.525228429Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525268 containerd[1536]: time="2025-02-13T19:51:46.525261972Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525317 containerd[1536]: time="2025-02-13T19:51:46.525309663Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Feb 13 19:51:46.525354 containerd[1536]: time="2025-02-13T19:51:46.525347982Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.525390 containerd[1536]: time="2025-02-13T19:51:46.525383835Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525423959Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525437475Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525445255Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525453241Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525459954Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525469097Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525480478Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525495567Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525505299Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525512293Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525519680Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525527905Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525540760Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525549613Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526001 containerd[1536]: time="2025-02-13T19:51:46.525555701Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525583035Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525595231Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525602726Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525609143Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525614325Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525620958Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525626775Z" level=info msg="NRI interface is disabled by configuration."
Feb 13 19:51:46.526248 containerd[1536]: time="2025-02-13T19:51:46.525632900Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Feb 13 19:51:46.526363 containerd[1536]: time="2025-02-13T19:51:46.525803056Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:true] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:true SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Feb 13 19:51:46.526363 containerd[1536]: time="2025-02-13T19:51:46.525831229Z" level=info msg="Connect containerd service"
Feb 13 19:51:46.526363 containerd[1536]: time="2025-02-13T19:51:46.525848975Z" level=info msg="using legacy CRI server"
Feb 13 19:51:46.526363 containerd[1536]: time="2025-02-13T19:51:46.525853425Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Feb 13 19:51:46.526363 containerd[1536]: time="2025-02-13T19:51:46.525917772Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Feb 13 19:51:46.526836 containerd[1536]: time="2025-02-13T19:51:46.526820462Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Feb 13 19:51:46.527040 containerd[1536]: time="2025-02-13T19:51:46.527014325Z" level=info msg="Start subscribing containerd event"
Feb 13 19:51:46.527069 containerd[1536]: time="2025-02-13T19:51:46.527048593Z" level=info msg="Start recovering state"
Feb 13 19:51:46.527092 containerd[1536]: time="2025-02-13T19:51:46.527083531Z" level=info msg="Start event monitor"
Feb 13 19:51:46.527111 containerd[1536]: time="2025-02-13T19:51:46.527092760Z" level=info msg="Start snapshots syncer"
Feb 13 19:51:46.527111 containerd[1536]: time="2025-02-13T19:51:46.527098547Z" level=info msg="Start cni network conf syncer for default"
Feb 13 19:51:46.527111 containerd[1536]: time="2025-02-13T19:51:46.527103937Z" level=info msg="Start streaming server"
Feb 13 19:51:46.527161 containerd[1536]: time="2025-02-13T19:51:46.527087862Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Feb 13 19:51:46.527221 containerd[1536]: time="2025-02-13T19:51:46.527213052Z" level=info msg=serving... address=/run/containerd/containerd.sock
Feb 13 19:51:46.527286 containerd[1536]: time="2025-02-13T19:51:46.527277346Z" level=info msg="containerd successfully booted in 0.042015s"
Feb 13 19:51:46.527332 systemd[1]: Started containerd.service - containerd container runtime.
Feb 13 19:51:46.661369 tar[1534]: linux-amd64/LICENSE
Feb 13 19:51:46.661482 tar[1534]: linux-amd64/README.md
Feb 13 19:51:46.669157 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Feb 13 19:51:47.362132 systemd-networkd[1445]: ens192: Gained IPv6LL
Feb 13 19:51:47.363590 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Feb 13 19:51:47.364022 systemd[1]: Reached target network-online.target - Network is Online.
Feb 13 19:51:47.368148 systemd[1]: Starting coreos-metadata.service - VMware metadata agent...
Feb 13 19:51:47.369278 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 19:51:47.372022 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Feb 13 19:51:47.392933 systemd[1]: coreos-metadata.service: Deactivated successfully.
Feb 13 19:51:47.393104 systemd[1]: Finished coreos-metadata.service - VMware metadata agent.
Feb 13 19:51:47.394478 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Feb 13 19:51:47.395551 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Feb 13 19:51:48.024671 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:51:48.025027 systemd[1]: Reached target multi-user.target - Multi-User System. Feb 13 19:51:48.027032 systemd[1]: Startup finished in 1.018s (kernel) + 5.313s (initrd) + 4.055s (userspace) = 10.387s. Feb 13 19:51:48.028049 (kubelet)[1687]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:51:48.034053 agetty[1637]: failed to open credentials directory Feb 13 19:51:48.034754 agetty[1643]: failed to open credentials directory Feb 13 19:51:48.050758 login[1643]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 19:51:48.050925 login[1637]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Feb 13 19:51:48.056693 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Feb 13 19:51:48.061136 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Feb 13 19:51:48.063238 systemd-logind[1517]: New session 1 of user core. Feb 13 19:51:48.067221 systemd-logind[1517]: New session 2 of user core. Feb 13 19:51:48.071064 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Feb 13 19:51:48.077134 systemd[1]: Starting user@500.service - User Manager for UID 500... Feb 13 19:51:48.078941 (systemd)[1694]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Feb 13 19:51:48.135386 systemd[1694]: Queued start job for default target default.target. Feb 13 19:51:48.142939 systemd[1694]: Created slice app.slice - User Application Slice. Feb 13 19:51:48.142958 systemd[1694]: Reached target paths.target - Paths. Feb 13 19:51:48.142967 systemd[1694]: Reached target timers.target - Timers. Feb 13 19:51:48.143670 systemd[1694]: Starting dbus.socket - D-Bus User Message Bus Socket... 
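The startup summary above reports kernel, initrd, and userspace phases separately; the components sum to the reported total within rounding. A quick arithmetic check, using the figures straight from the log:

```python
# Startup phase durations reported by systemd (seconds), copied from the log above.
kernel, initrd, userspace = 1.018, 5.313, 4.055
reported_total = 10.387

total = kernel + initrd + userspace
# systemd rounds each phase independently, so the sum can differ
# from the reported total by about a millisecond.
assert abs(total - reported_total) < 0.005
print(f"{total:.3f}s computed vs {reported_total}s reported")
```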
Feb 13 19:51:48.152590 systemd[1694]: Listening on dbus.socket - D-Bus User Message Bus Socket. Feb 13 19:51:48.152624 systemd[1694]: Reached target sockets.target - Sockets. Feb 13 19:51:48.152634 systemd[1694]: Reached target basic.target - Basic System. Feb 13 19:51:48.152658 systemd[1694]: Reached target default.target - Main User Target. Feb 13 19:51:48.152678 systemd[1694]: Startup finished in 69ms. Feb 13 19:51:48.152838 systemd[1]: Started user@500.service - User Manager for UID 500. Feb 13 19:51:48.157297 systemd[1]: Started session-1.scope - Session 1 of User core. Feb 13 19:51:48.158482 systemd[1]: Started session-2.scope - Session 2 of User core. Feb 13 19:51:48.545662 kubelet[1687]: E0213 19:51:48.545629 1687 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:51:48.547035 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:51:48.547117 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:51:58.797460 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Feb 13 19:51:58.809161 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:51:59.146450 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 19:51:59.149536 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:51:59.241328 kubelet[1737]: E0213 19:51:59.241298 1737 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:51:59.244025 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:51:59.244117 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:52:09.494440 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Feb 13 19:52:09.505160 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:09.745618 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:09.748277 (kubelet)[1753]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:52:09.796213 kubelet[1753]: E0213 19:52:09.796169 1753 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:52:09.797746 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:52:09.797848 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:52:19.881711 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Feb 13 19:52:19.890123 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
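The kubelet is crash-looping on the missing /var/lib/kubelet/config.yaml (a file normally generated by kubeadm during init/join), and the roughly ten-second spacing between the "Scheduled restart job" entries suggests a RestartSec=10 unit setting rather than any rate limiting. A small sketch that recovers the gaps from the journal timestamps above:

```python
from datetime import datetime

# "Scheduled restart job" timestamps copied from the journal above.
restarts = ["19:51:58.797460", "19:52:09.494440", "19:52:19.881711"]
ts = [datetime.strptime(t, "%H:%M:%S.%f") for t in restarts]
gaps = [(b - a).total_seconds() for a, b in zip(ts, ts[1:])]

# Each gap is ~10 s of restart delay plus the short time the failed
# run itself took, which is why the counter climbs steadily.
print([round(g, 1) for g in gaps])
```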
Feb 13 19:52:19.997445 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:20.000508 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:52:20.030303 kubelet[1768]: E0213 19:52:20.030267 1768 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:52:20.031430 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:52:20.031510 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:52:26.427116 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Feb 13 19:52:26.437242 systemd[1]: Started sshd@0-139.178.70.104:22-147.75.109.163:45948.service - OpenSSH per-connection server daemon (147.75.109.163:45948). Feb 13 19:52:26.523259 sshd[1778]: Accepted publickey for core from 147.75.109.163 port 45948 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:26.524180 sshd-session[1778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:26.527692 systemd-logind[1517]: New session 3 of user core. Feb 13 19:52:26.537098 systemd[1]: Started session-3.scope - Session 3 of User core. Feb 13 19:52:26.598148 systemd[1]: Started sshd@1-139.178.70.104:22-147.75.109.163:45956.service - OpenSSH per-connection server daemon (147.75.109.163:45956). 
Feb 13 19:52:26.631635 sshd[1783]: Accepted publickey for core from 147.75.109.163 port 45956 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:26.632388 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:26.636118 systemd-logind[1517]: New session 4 of user core. Feb 13 19:52:26.642071 systemd[1]: Started session-4.scope - Session 4 of User core. Feb 13 19:52:26.693219 sshd[1785]: Connection closed by 147.75.109.163 port 45956 Feb 13 19:52:26.693512 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:26.701898 systemd[1]: sshd@1-139.178.70.104:22-147.75.109.163:45956.service: Deactivated successfully. Feb 13 19:52:26.702934 systemd[1]: session-4.scope: Deactivated successfully. Feb 13 19:52:26.703964 systemd-logind[1517]: Session 4 logged out. Waiting for processes to exit. Feb 13 19:52:26.704699 systemd[1]: Started sshd@2-139.178.70.104:22-147.75.109.163:45958.service - OpenSSH per-connection server daemon (147.75.109.163:45958). Feb 13 19:52:26.706129 systemd-logind[1517]: Removed session 4. Feb 13 19:52:26.739128 sshd[1790]: Accepted publickey for core from 147.75.109.163 port 45958 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:26.739808 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:26.742438 systemd-logind[1517]: New session 5 of user core. Feb 13 19:52:26.749068 systemd[1]: Started session-5.scope - Session 5 of User core. Feb 13 19:52:26.795335 sshd[1792]: Connection closed by 147.75.109.163 port 45958 Feb 13 19:52:26.795064 sshd-session[1790]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:26.811516 systemd[1]: sshd@2-139.178.70.104:22-147.75.109.163:45958.service: Deactivated successfully. Feb 13 19:52:26.812299 systemd[1]: session-5.scope: Deactivated successfully. Feb 13 19:52:26.813043 systemd-logind[1517]: Session 5 logged out. 
Waiting for processes to exit. Feb 13 19:52:26.813723 systemd[1]: Started sshd@3-139.178.70.104:22-147.75.109.163:45974.service - OpenSSH per-connection server daemon (147.75.109.163:45974). Feb 13 19:52:26.815178 systemd-logind[1517]: Removed session 5. Feb 13 19:52:26.848355 sshd[1797]: Accepted publickey for core from 147.75.109.163 port 45974 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:26.849126 sshd-session[1797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:26.851951 systemd-logind[1517]: New session 6 of user core. Feb 13 19:52:26.871094 systemd[1]: Started session-6.scope - Session 6 of User core. Feb 13 19:52:26.919102 sshd[1799]: Connection closed by 147.75.109.163 port 45974 Feb 13 19:52:26.919383 sshd-session[1797]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:26.935465 systemd[1]: sshd@3-139.178.70.104:22-147.75.109.163:45974.service: Deactivated successfully. Feb 13 19:52:26.936260 systemd[1]: session-6.scope: Deactivated successfully. Feb 13 19:52:26.937029 systemd-logind[1517]: Session 6 logged out. Waiting for processes to exit. Feb 13 19:52:26.937696 systemd[1]: Started sshd@4-139.178.70.104:22-147.75.109.163:45990.service - OpenSSH per-connection server daemon (147.75.109.163:45990). Feb 13 19:52:26.939125 systemd-logind[1517]: Removed session 6. Feb 13 19:52:26.971897 sshd[1804]: Accepted publickey for core from 147.75.109.163 port 45990 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:26.972622 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:26.975806 systemd-logind[1517]: New session 7 of user core. Feb 13 19:52:26.982071 systemd[1]: Started session-7.scope - Session 7 of User core. 
Feb 13 19:52:27.036173 sudo[1807]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Feb 13 19:52:27.036337 sudo[1807]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:52:27.048495 sudo[1807]: pam_unix(sudo:session): session closed for user root Feb 13 19:52:27.049919 sshd[1806]: Connection closed by 147.75.109.163 port 45990 Feb 13 19:52:27.049929 sshd-session[1804]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:27.057543 systemd[1]: sshd@4-139.178.70.104:22-147.75.109.163:45990.service: Deactivated successfully. Feb 13 19:52:27.058358 systemd[1]: session-7.scope: Deactivated successfully. Feb 13 19:52:27.059150 systemd-logind[1517]: Session 7 logged out. Waiting for processes to exit. Feb 13 19:52:27.059874 systemd[1]: Started sshd@5-139.178.70.104:22-147.75.109.163:45994.service - OpenSSH per-connection server daemon (147.75.109.163:45994). Feb 13 19:52:27.061237 systemd-logind[1517]: Removed session 7. Feb 13 19:52:27.094088 sshd[1812]: Accepted publickey for core from 147.75.109.163 port 45994 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:27.094791 sshd-session[1812]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:27.098022 systemd-logind[1517]: New session 8 of user core. Feb 13 19:52:27.107140 systemd[1]: Started session-8.scope - Session 8 of User core. 
Feb 13 19:52:27.154841 sudo[1816]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Feb 13 19:52:27.155025 sudo[1816]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:52:27.156834 sudo[1816]: pam_unix(sudo:session): session closed for user root Feb 13 19:52:27.159705 sudo[1815]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Feb 13 19:52:27.159850 sudo[1815]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:52:27.170258 systemd[1]: Starting audit-rules.service - Load Audit Rules... Feb 13 19:52:27.184864 augenrules[1838]: No rules Feb 13 19:52:27.185651 systemd[1]: audit-rules.service: Deactivated successfully. Feb 13 19:52:27.185773 systemd[1]: Finished audit-rules.service - Load Audit Rules. Feb 13 19:52:27.186469 sudo[1815]: pam_unix(sudo:session): session closed for user root Feb 13 19:52:27.187235 sshd[1814]: Connection closed by 147.75.109.163 port 45994 Feb 13 19:52:27.187451 sshd-session[1812]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:27.197374 systemd[1]: sshd@5-139.178.70.104:22-147.75.109.163:45994.service: Deactivated successfully. Feb 13 19:52:27.198524 systemd[1]: session-8.scope: Deactivated successfully. Feb 13 19:52:27.199616 systemd-logind[1517]: Session 8 logged out. Waiting for processes to exit. Feb 13 19:52:27.200073 systemd[1]: Started sshd@6-139.178.70.104:22-147.75.109.163:45998.service - OpenSSH per-connection server daemon (147.75.109.163:45998). Feb 13 19:52:27.201422 systemd-logind[1517]: Removed session 8. 
Feb 13 19:52:27.233965 sshd[1846]: Accepted publickey for core from 147.75.109.163 port 45998 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:52:27.234230 sshd-session[1846]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:52:27.237314 systemd-logind[1517]: New session 9 of user core. Feb 13 19:52:27.243068 systemd[1]: Started session-9.scope - Session 9 of User core. Feb 13 19:52:27.290746 sudo[1849]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Feb 13 19:52:27.290906 sudo[1849]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Feb 13 19:52:27.559126 systemd[1]: Starting docker.service - Docker Application Container Engine... Feb 13 19:52:27.559202 (dockerd)[1866]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Feb 13 19:52:28.028260 dockerd[1866]: time="2025-02-13T19:52:28.028221940Z" level=info msg="Starting up" Feb 13 19:52:28.097892 dockerd[1866]: time="2025-02-13T19:52:28.097871225Z" level=info msg="Loading containers: start." Feb 13 19:52:28.218024 kernel: Initializing XFRM netlink socket Feb 13 19:52:28.276177 systemd-networkd[1445]: docker0: Link UP Feb 13 19:52:28.299043 dockerd[1866]: time="2025-02-13T19:52:28.298889331Z" level=info msg="Loading containers: done." Feb 13 19:52:28.310730 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3289471261-merged.mount: Deactivated successfully. 
Feb 13 19:52:28.312004 dockerd[1866]: time="2025-02-13T19:52:28.311783727Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Feb 13 19:52:28.312004 dockerd[1866]: time="2025-02-13T19:52:28.311835931Z" level=info msg="Docker daemon" commit=41ca978a0a5400cc24b274137efa9f25517fcc0b containerd-snapshotter=false storage-driver=overlay2 version=27.3.1 Feb 13 19:52:28.312004 dockerd[1866]: time="2025-02-13T19:52:28.311889350Z" level=info msg="Daemon has completed initialization" Feb 13 19:52:28.326156 dockerd[1866]: time="2025-02-13T19:52:28.325967356Z" level=info msg="API listen on /run/docker.sock" Feb 13 19:52:28.326022 systemd[1]: Started docker.service - Docker Application Container Engine. Feb 13 19:52:29.082635 containerd[1536]: time="2025-02-13T19:52:29.080909601Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.10\"" Feb 13 19:52:29.635199 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1758593229.mount: Deactivated successfully. Feb 13 19:52:30.131537 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Feb 13 19:52:30.141118 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:30.200084 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Feb 13 19:52:30.202528 (kubelet)[2119]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:52:30.299040 kubelet[2119]: E0213 19:52:30.298994 2119 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:52:30.300337 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:52:30.300452 systemd[1]: kubelet.service: Failed with result 'exit-code'. Feb 13 19:52:30.974732 containerd[1536]: time="2025-02-13T19:52:30.974696425Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:30.975480 containerd[1536]: time="2025-02-13T19:52:30.975244670Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.10: active requests=0, bytes read=32678214" Feb 13 19:52:30.975480 containerd[1536]: time="2025-02-13T19:52:30.975410476Z" level=info msg="ImageCreate event name:\"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:30.976904 containerd[1536]: time="2025-02-13T19:52:30.976882909Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:63b2b4b4e9b5dcb5b1b6cec9d5f5f538791a40cd8cb273ef530e6d6535aa0b43\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:30.977644 containerd[1536]: time="2025-02-13T19:52:30.977503462Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.10\" with image id \"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.10\", 
repo digest \"registry.k8s.io/kube-apiserver@sha256:63b2b4b4e9b5dcb5b1b6cec9d5f5f538791a40cd8cb273ef530e6d6535aa0b43\", size \"32675014\" in 1.896568846s" Feb 13 19:52:30.977644 containerd[1536]: time="2025-02-13T19:52:30.977522546Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.10\" returns image reference \"sha256:172a4e0b731db1008c5339e0b8ef232f5c55632099e37cccfb9ba786c19580c4\"" Feb 13 19:52:30.990083 containerd[1536]: time="2025-02-13T19:52:30.990048399Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.10\"" Feb 13 19:52:31.644001 update_engine[1519]: I20250213 19:52:31.643014 1519 update_attempter.cc:509] Updating boot flags... Feb 13 19:52:31.679010 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2144) Feb 13 19:52:31.924004 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2143) Feb 13 19:52:32.038999 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 39 scanned by (udev-worker) (2143) Feb 13 19:52:32.481510 containerd[1536]: time="2025-02-13T19:52:32.481482120Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:32.488684 containerd[1536]: time="2025-02-13T19:52:32.488653610Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.10: active requests=0, bytes read=29611545" Feb 13 19:52:32.494264 containerd[1536]: time="2025-02-13T19:52:32.494243811Z" level=info msg="ImageCreate event name:\"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:32.505135 containerd[1536]: time="2025-02-13T19:52:32.505104478Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:99b3336343ea48be24f1e64774825e9f8d5170bd2ed482ff336548eb824f5f58\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:32.506449 containerd[1536]: time="2025-02-13T19:52:32.506423313Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.10\" with image id \"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:99b3336343ea48be24f1e64774825e9f8d5170bd2ed482ff336548eb824f5f58\", size \"31058091\" in 1.516339459s" Feb 13 19:52:32.506484 containerd[1536]: time="2025-02-13T19:52:32.506448414Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.10\" returns image reference \"sha256:f81ad4d47d77570472cf20a1f6b008ece135be405b2f52f50ed6820f2b6f9a5f\"" Feb 13 19:52:32.524164 containerd[1536]: time="2025-02-13T19:52:32.524068768Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.10\"" Feb 13 19:52:33.616761 containerd[1536]: time="2025-02-13T19:52:33.616700608Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:33.617333 containerd[1536]: time="2025-02-13T19:52:33.617280142Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.10: active requests=0, bytes read=17782130" Feb 13 19:52:33.617944 containerd[1536]: time="2025-02-13T19:52:33.617527374Z" level=info msg="ImageCreate event name:\"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:33.619200 containerd[1536]: time="2025-02-13T19:52:33.619171246Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:cf7eb256192f1f51093fe278c209a9368f0675eb61ed01b148af47d2f21c002d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:33.619760 containerd[1536]: time="2025-02-13T19:52:33.619745043Z" level=info 
msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.10\" with image id \"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:cf7eb256192f1f51093fe278c209a9368f0675eb61ed01b148af47d2f21c002d\", size \"19228694\" in 1.095653533s" Feb 13 19:52:33.619789 containerd[1536]: time="2025-02-13T19:52:33.619762192Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.10\" returns image reference \"sha256:64edffde4bf75617ad8fc73556d5e80d34b9425c79106b7f74b2059243b2ffe8\"" Feb 13 19:52:33.635734 containerd[1536]: time="2025-02-13T19:52:33.635705878Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\"" Feb 13 19:52:34.855901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1785237792.mount: Deactivated successfully. Feb 13 19:52:35.150412 containerd[1536]: time="2025-02-13T19:52:35.150006324Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:35.152634 containerd[1536]: time="2025-02-13T19:52:35.152612482Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.10: active requests=0, bytes read=29057858" Feb 13 19:52:35.163786 containerd[1536]: time="2025-02-13T19:52:35.163743304Z" level=info msg="ImageCreate event name:\"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:35.171216 containerd[1536]: time="2025-02-13T19:52:35.171186527Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:35.171719 containerd[1536]: time="2025-02-13T19:52:35.171593148Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.10\" with image id 
\"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\", repo tag \"registry.k8s.io/kube-proxy:v1.30.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:d112e804e548fce28d9f1e3282c9ce54e374451e6a2c41b1ca9d7fca5d1fcc48\", size \"29056877\" in 1.535861319s" Feb 13 19:52:35.171719 containerd[1536]: time="2025-02-13T19:52:35.171617309Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.10\" returns image reference \"sha256:a21d1b47e857207628486a387f670f224051a16b74b06a1b76d07a96e738ab54\"" Feb 13 19:52:35.186906 containerd[1536]: time="2025-02-13T19:52:35.186875918Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Feb 13 19:52:35.974187 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1957947837.mount: Deactivated successfully. Feb 13 19:52:36.677188 containerd[1536]: time="2025-02-13T19:52:36.677118561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:36.682589 containerd[1536]: time="2025-02-13T19:52:36.682426402Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Feb 13 19:52:36.690631 containerd[1536]: time="2025-02-13T19:52:36.690588395Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:36.696127 containerd[1536]: time="2025-02-13T19:52:36.696089793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:36.696853 containerd[1536]: time="2025-02-13T19:52:36.696668523Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id 
\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 1.509765753s" Feb 13 19:52:36.696853 containerd[1536]: time="2025-02-13T19:52:36.696687820Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Feb 13 19:52:36.712332 containerd[1536]: time="2025-02-13T19:52:36.712300130Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Feb 13 19:52:37.317758 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2383356747.mount: Deactivated successfully. Feb 13 19:52:37.319933 containerd[1536]: time="2025-02-13T19:52:37.319514470Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:37.320335 containerd[1536]: time="2025-02-13T19:52:37.320309976Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290" Feb 13 19:52:37.320660 containerd[1536]: time="2025-02-13T19:52:37.320648484Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:37.321860 containerd[1536]: time="2025-02-13T19:52:37.321845444Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:37.322504 containerd[1536]: time="2025-02-13T19:52:37.322491679Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest 
\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 610.152475ms" Feb 13 19:52:37.322563 containerd[1536]: time="2025-02-13T19:52:37.322554318Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Feb 13 19:52:37.336275 containerd[1536]: time="2025-02-13T19:52:37.336255472Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Feb 13 19:52:37.777492 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2089089534.mount: Deactivated successfully. Feb 13 19:52:40.381713 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Feb 13 19:52:40.387150 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:40.589439 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:40.592847 (kubelet)[2291]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Feb 13 19:52:40.754518 kubelet[2291]: E0213 19:52:40.754392 2291 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Feb 13 19:52:40.755638 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Feb 13 19:52:40.755724 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Feb 13 19:52:42.771313 containerd[1536]: time="2025-02-13T19:52:42.771255552Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:42.792978 containerd[1536]: time="2025-02-13T19:52:42.792926087Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571" Feb 13 19:52:42.811953 containerd[1536]: time="2025-02-13T19:52:42.810726045Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:42.871644 containerd[1536]: time="2025-02-13T19:52:42.871569288Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:52:42.872523 containerd[1536]: time="2025-02-13T19:52:42.872499509Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.536125208s" Feb 13 19:52:42.872569 containerd[1536]: time="2025-02-13T19:52:42.872530481Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Feb 13 19:52:45.324506 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:45.340130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:45.354844 systemd[1]: Reloading requested from client PID 2370 ('systemctl') (unit session-9.scope)... Feb 13 19:52:45.354862 systemd[1]: Reloading... 
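Each "Pulled image" entry above reports both the resolved size in bytes and the wall-clock duration, which is enough to estimate effective registry throughput per pull. A quick sketch using three of the figures from the log (note the tiny pause image is dominated by request latency, not bandwidth):

```python
# (size in bytes, pull duration in seconds), as reported by containerd above.
pulls = {
    "kube-apiserver:v1.30.10": (32675014, 1.896568846),
    "etcd:3.5.12-0": (57236178, 5.536125208),
    "pause:3.9": (321520, 0.610152475),
}

for image, (size, secs) in pulls.items():
    # Effective throughput; per-request overhead makes small
    # images look far slower than the link actually is.
    print(f"{image}: {size / secs / 1e6:.1f} MB/s")
```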
Feb 13 19:52:45.428009 zram_generator::config[2407]: No configuration found.
Feb 13 19:52:45.472924 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Feb 13 19:52:45.489293 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Feb 13 19:52:45.534662 systemd[1]: Reloading finished in 179 ms.
Feb 13 19:52:45.572645 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Feb 13 19:52:45.572687 systemd[1]: kubelet.service: Failed with result 'signal'.
Feb 13 19:52:45.573002 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 19:52:45.578151 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Feb 13 19:52:45.862791 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Feb 13 19:52:45.866402 (kubelet)[2475]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Feb 13 19:52:45.929424 kubelet[2475]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 19:52:45.929424 kubelet[2475]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Feb 13 19:52:45.929424 kubelet[2475]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Feb 13 19:52:45.941311 kubelet[2475]: I0213 19:52:45.941283 2475 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Feb 13 19:52:46.366880 kubelet[2475]: I0213 19:52:46.366857 2475 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
Feb 13 19:52:46.367244 kubelet[2475]: I0213 19:52:46.366970 2475 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Feb 13 19:52:46.367244 kubelet[2475]: I0213 19:52:46.367117 2475 server.go:927] "Client rotation is on, will bootstrap in background"
Feb 13 19:52:46.446030 kubelet[2475]: E0213 19:52:46.445671 2475 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.104:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.446030 kubelet[2475]: I0213 19:52:46.445901 2475 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Feb 13 19:52:46.458672 kubelet[2475]: I0213 19:52:46.458591 2475 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Feb 13 19:52:46.458781 kubelet[2475]: I0213 19:52:46.458749 2475 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Feb 13 19:52:46.460004 kubelet[2475]: I0213 19:52:46.458783 2475 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
Feb 13 19:52:46.460605 kubelet[2475]: I0213 19:52:46.460589 2475 topology_manager.go:138] "Creating topology manager with none policy"
Feb 13 19:52:46.460643 kubelet[2475]: I0213 19:52:46.460606 2475 container_manager_linux.go:301] "Creating device plugin manager"
Feb 13 19:52:46.460708 kubelet[2475]: I0213 19:52:46.460693 2475 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 19:52:46.461367 kubelet[2475]: I0213 19:52:46.461351 2475 kubelet.go:400] "Attempting to sync node with API server"
Feb 13 19:52:46.461367 kubelet[2475]: I0213 19:52:46.461364 2475 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
Feb 13 19:52:46.461410 kubelet[2475]: I0213 19:52:46.461381 2475 kubelet.go:312] "Adding apiserver pod source"
Feb 13 19:52:46.461410 kubelet[2475]: I0213 19:52:46.461391 2475 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Feb 13 19:52:46.463861 kubelet[2475]: W0213 19:52:46.463498 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.463861 kubelet[2475]: E0213 19:52:46.463532 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.463861 kubelet[2475]: W0213 19:52:46.463598 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.463861 kubelet[2475]: E0213 19:52:46.463633 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.463861 kubelet[2475]: I0213 19:52:46.463691 2475 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1"
Feb 13 19:52:46.465474 kubelet[2475]: I0213 19:52:46.464845 2475 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Feb 13 19:52:46.465474 kubelet[2475]: W0213 19:52:46.464908 2475 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Feb 13 19:52:46.465474 kubelet[2475]: I0213 19:52:46.465400 2475 server.go:1264] "Started kubelet"
Feb 13 19:52:46.469771 kubelet[2475]: I0213 19:52:46.469494 2475 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Feb 13 19:52:46.471646 kubelet[2475]: I0213 19:52:46.471148 2475 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Feb 13 19:52:46.471646 kubelet[2475]: I0213 19:52:46.471186 2475 server.go:455] "Adding debug handlers to kubelet server"
Feb 13 19:52:46.471646 kubelet[2475]: I0213 19:52:46.471368 2475 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Feb 13 19:52:46.471646 kubelet[2475]: E0213 19:52:46.471476 2475 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.104:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.104:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.1823dc8a065b2542 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-02-13 19:52:46.465377602 +0000 UTC m=+0.597112512,LastTimestamp:2025-02-13 19:52:46.465377602 +0000 UTC m=+0.597112512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
Feb 13 19:52:46.472826 kubelet[2475]: I0213 19:52:46.472372 2475 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Feb 13 19:52:46.476008 kubelet[2475]: E0213 19:52:46.475954 2475 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found"
Feb 13 19:52:46.476058 kubelet[2475]: I0213 19:52:46.476052 2475 volume_manager.go:291] "Starting Kubelet Volume Manager"
Feb 13 19:52:46.476189 kubelet[2475]: I0213 19:52:46.476146 2475 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
Feb 13 19:52:46.476250 kubelet[2475]: I0213 19:52:46.476246 2475 reconciler.go:26] "Reconciler: start to sync state"
Feb 13 19:52:46.476535 kubelet[2475]: W0213 19:52:46.476514 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.476604 kubelet[2475]: E0213 19:52:46.476597 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.476867 kubelet[2475]: E0213 19:52:46.476773 2475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="200ms"
Feb 13 19:52:46.479454 kubelet[2475]: I0213 19:52:46.479442 2475 factory.go:221] Registration of the systemd container factory successfully
Feb 13 19:52:46.479776 kubelet[2475]: I0213 19:52:46.479562 2475 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Feb 13 19:52:46.480492 kubelet[2475]: I0213 19:52:46.480482 2475 factory.go:221] Registration of the containerd container factory successfully
Feb 13 19:52:46.481785 kubelet[2475]: E0213 19:52:46.481768 2475 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Feb 13 19:52:46.489545 kubelet[2475]: I0213 19:52:46.489513 2475 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Feb 13 19:52:46.490246 kubelet[2475]: I0213 19:52:46.490232 2475 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Feb 13 19:52:46.490285 kubelet[2475]: I0213 19:52:46.490253 2475 status_manager.go:217] "Starting to sync pod status with apiserver"
Feb 13 19:52:46.490285 kubelet[2475]: I0213 19:52:46.490269 2475 kubelet.go:2337] "Starting kubelet main sync loop"
Feb 13 19:52:46.490331 kubelet[2475]: E0213 19:52:46.490294 2475 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Feb 13 19:52:46.494995 kubelet[2475]: W0213 19:52:46.494959 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.495083 kubelet[2475]: E0213 19:52:46.495014 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:46.505303 kubelet[2475]: I0213 19:52:46.505278 2475 cpu_manager.go:214] "Starting CPU manager" policy="none"
Feb 13 19:52:46.505303 kubelet[2475]: I0213 19:52:46.505294 2475 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Feb 13 19:52:46.505303 kubelet[2475]: I0213 19:52:46.505304 2475 state_mem.go:36] "Initialized new in-memory state store"
Feb 13 19:52:46.507243 kubelet[2475]: I0213 19:52:46.507180 2475 policy_none.go:49] "None policy: Start"
Feb 13 19:52:46.507662 kubelet[2475]: I0213 19:52:46.507647 2475 memory_manager.go:170] "Starting memorymanager" policy="None"
Feb 13 19:52:46.507662 kubelet[2475]: I0213 19:52:46.507663 2475 state_mem.go:35] "Initializing new in-memory state store"
Feb 13 19:52:46.513282 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Feb 13 19:52:46.529583 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Feb 13 19:52:46.532363 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Feb 13 19:52:46.540802 kubelet[2475]: I0213 19:52:46.540536 2475 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Feb 13 19:52:46.540802 kubelet[2475]: I0213 19:52:46.540653 2475 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Feb 13 19:52:46.540802 kubelet[2475]: I0213 19:52:46.540718 2475 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Feb 13 19:52:46.541558 kubelet[2475]: E0213 19:52:46.541547 2475 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
Feb 13 19:52:46.577315 kubelet[2475]: I0213 19:52:46.577300 2475 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 19:52:46.577697 kubelet[2475]: E0213 19:52:46.577668 2475 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Feb 13 19:52:46.590928 kubelet[2475]: I0213 19:52:46.590911 2475 topology_manager.go:215] "Topology Admit Handler" podUID="48028343106967711af6e81bafc48212" podNamespace="kube-system" podName="kube-apiserver-localhost"
Feb 13 19:52:46.591590 kubelet[2475]: I0213 19:52:46.591569 2475 topology_manager.go:215] "Topology Admit Handler" podUID="dd3721fb1a67092819e35b40473f4063" podNamespace="kube-system" podName="kube-controller-manager-localhost"
Feb 13 19:52:46.592596 kubelet[2475]: I0213 19:52:46.592257 2475 topology_manager.go:215] "Topology Admit Handler" podUID="8d610d6c43052dbc8df47eb68906a982" podNamespace="kube-system" podName="kube-scheduler-localhost"
Feb 13 19:52:46.597184 systemd[1]: Created slice kubepods-burstable-pod48028343106967711af6e81bafc48212.slice - libcontainer container kubepods-burstable-pod48028343106967711af6e81bafc48212.slice.
Feb 13 19:52:46.622486 systemd[1]: Created slice kubepods-burstable-poddd3721fb1a67092819e35b40473f4063.slice - libcontainer container kubepods-burstable-poddd3721fb1a67092819e35b40473f4063.slice.
Feb 13 19:52:46.631502 systemd[1]: Created slice kubepods-burstable-pod8d610d6c43052dbc8df47eb68906a982.slice - libcontainer container kubepods-burstable-pod8d610d6c43052dbc8df47eb68906a982.slice.
Feb 13 19:52:46.677664 kubelet[2475]: E0213 19:52:46.677624 2475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="400ms"
Feb 13 19:52:46.778338 kubelet[2475]: I0213 19:52:46.778115 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 19:52:46.778338 kubelet[2475]: I0213 19:52:46.778151 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8d610d6c43052dbc8df47eb68906a982-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8d610d6c43052dbc8df47eb68906a982\") " pod="kube-system/kube-scheduler-localhost"
Feb 13 19:52:46.778338 kubelet[2475]: I0213 19:52:46.778165 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 19:52:46.778338 kubelet[2475]: I0213 19:52:46.778181 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 19:52:46.778338 kubelet[2475]: I0213 19:52:46.778196 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 19:52:46.778549 kubelet[2475]: I0213 19:52:46.778209 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost"
Feb 13 19:52:46.778549 kubelet[2475]: I0213 19:52:46.778221 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 19:52:46.778549 kubelet[2475]: I0213 19:52:46.778232 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 19:52:46.778549 kubelet[2475]: I0213 19:52:46.778280 2475 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost"
Feb 13 19:52:46.778879 kubelet[2475]: I0213 19:52:46.778865 2475 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 19:52:46.779249 kubelet[2475]: E0213 19:52:46.779222 2475 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Feb 13 19:52:46.920655 containerd[1536]: time="2025-02-13T19:52:46.920556918Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:48028343106967711af6e81bafc48212,Namespace:kube-system,Attempt:0,}"
Feb 13 19:52:46.929121 containerd[1536]: time="2025-02-13T19:52:46.929077247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:dd3721fb1a67092819e35b40473f4063,Namespace:kube-system,Attempt:0,}"
Feb 13 19:52:46.934582 containerd[1536]: time="2025-02-13T19:52:46.934535602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8d610d6c43052dbc8df47eb68906a982,Namespace:kube-system,Attempt:0,}"
Feb 13 19:52:47.078623 kubelet[2475]: E0213 19:52:47.078590 2475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="800ms"
Feb 13 19:52:47.181022 kubelet[2475]: I0213 19:52:47.180921 2475 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
Feb 13 19:52:47.181398 kubelet[2475]: E0213 19:52:47.181380 2475 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost"
Feb 13 19:52:47.390380 kubelet[2475]: W0213 19:52:47.390323 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.390380 kubelet[2475]: E0213 19:52:47.390365 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.104:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.494222 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1818209481.mount: Deactivated successfully.
Feb 13 19:52:47.519068 containerd[1536]: time="2025-02-13T19:52:47.519024628Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 19:52:47.531609 containerd[1536]: time="2025-02-13T19:52:47.531584248Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056"
Feb 13 19:52:47.548434 containerd[1536]: time="2025-02-13T19:52:47.548407687Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 19:52:47.553484 containerd[1536]: time="2025-02-13T19:52:47.553457667Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 19:52:47.557937 containerd[1536]: time="2025-02-13T19:52:47.557908758Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Feb 13 19:52:47.566781 containerd[1536]: time="2025-02-13T19:52:47.566690226Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 19:52:47.567706 containerd[1536]: time="2025-02-13T19:52:47.567399061Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 638.245243ms"
Feb 13 19:52:47.569374 containerd[1536]: time="2025-02-13T19:52:47.569351831Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Feb 13 19:52:47.572764 containerd[1536]: time="2025-02-13T19:52:47.571963367Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0"
Feb 13 19:52:47.573093 containerd[1536]: time="2025-02-13T19:52:47.573071588Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 652.455311ms"
Feb 13 19:52:47.588614 containerd[1536]: time="2025-02-13T19:52:47.588515729Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 653.878165ms"
Feb 13 19:52:47.633031 kubelet[2475]: W0213 19:52:47.632953 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.633031 kubelet[2475]: E0213 19:52:47.633020 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.104:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.668506 kubelet[2475]: W0213 19:52:47.668358 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.668506 kubelet[2475]: E0213 19:52:47.668397 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.104:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.753273 containerd[1536]: time="2025-02-13T19:52:47.753080035Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 19:52:47.753273 containerd[1536]: time="2025-02-13T19:52:47.753130948Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 19:52:47.753273 containerd[1536]: time="2025-02-13T19:52:47.753159123Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.753453 containerd[1536]: time="2025-02-13T19:52:47.753215096Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.753811 containerd[1536]: time="2025-02-13T19:52:47.753750569Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 19:52:47.753862 containerd[1536]: time="2025-02-13T19:52:47.753814773Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 19:52:47.753862 containerd[1536]: time="2025-02-13T19:52:47.753826031Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.753917 containerd[1536]: time="2025-02-13T19:52:47.753870411Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.759195 containerd[1536]: time="2025-02-13T19:52:47.752046331Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Feb 13 19:52:47.759396 containerd[1536]: time="2025-02-13T19:52:47.759189210Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Feb 13 19:52:47.759396 containerd[1536]: time="2025-02-13T19:52:47.759204890Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.759396 containerd[1536]: time="2025-02-13T19:52:47.759247245Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Feb 13 19:52:47.761228 kubelet[2475]: W0213 19:52:47.761168 2475 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.761228 kubelet[2475]: E0213 19:52:47.761214 2475 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.104:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.104:6443: connect: connection refused
Feb 13 19:52:47.775134 systemd[1]: Started cri-containerd-d0ba0a295c0b4a77522347d6aa375c7e99b22e3a34fc0680318b4d28e384ab7c.scope - libcontainer container d0ba0a295c0b4a77522347d6aa375c7e99b22e3a34fc0680318b4d28e384ab7c.
Feb 13 19:52:47.779636 systemd[1]: Started cri-containerd-138cb036d3f572cf10cdb92b09f12fffd6bfa2f2c4a1dab6db7bd5cd6e7f08a1.scope - libcontainer container 138cb036d3f572cf10cdb92b09f12fffd6bfa2f2c4a1dab6db7bd5cd6e7f08a1.
Feb 13 19:52:47.781945 systemd[1]: Started cri-containerd-d5e18cbb4172c2bb5e1319014565296fe69b308e69a723c5c7c7413263eaf3e8.scope - libcontainer container d5e18cbb4172c2bb5e1319014565296fe69b308e69a723c5c7c7413263eaf3e8.
Feb 13 19:52:47.823000 containerd[1536]: time="2025-02-13T19:52:47.821558099Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:48028343106967711af6e81bafc48212,Namespace:kube-system,Attempt:0,} returns sandbox id \"138cb036d3f572cf10cdb92b09f12fffd6bfa2f2c4a1dab6db7bd5cd6e7f08a1\""
Feb 13 19:52:47.826995 containerd[1536]: time="2025-02-13T19:52:47.825343019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:dd3721fb1a67092819e35b40473f4063,Namespace:kube-system,Attempt:0,} returns sandbox id \"d0ba0a295c0b4a77522347d6aa375c7e99b22e3a34fc0680318b4d28e384ab7c\""
Feb 13 19:52:47.826995 containerd[1536]: time="2025-02-13T19:52:47.826507479Z" level=info msg="CreateContainer within sandbox \"138cb036d3f572cf10cdb92b09f12fffd6bfa2f2c4a1dab6db7bd5cd6e7f08a1\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Feb 13 19:52:47.826995 containerd[1536]: time="2025-02-13T19:52:47.826562432Z" level=info msg="CreateContainer within sandbox \"d0ba0a295c0b4a77522347d6aa375c7e99b22e3a34fc0680318b4d28e384ab7c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Feb 13 19:52:47.836235 containerd[1536]: time="2025-02-13T19:52:47.836146567Z" level=info msg="CreateContainer within sandbox \"d0ba0a295c0b4a77522347d6aa375c7e99b22e3a34fc0680318b4d28e384ab7c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"4e0bdd5f1fbde851dde85d25a8e8fd8b7a9edee44e3b8ccf1e771e3a68d29439\""
Feb 13 19:52:47.836753 containerd[1536]: time="2025-02-13T19:52:47.836742026Z" level=info msg="StartContainer for \"4e0bdd5f1fbde851dde85d25a8e8fd8b7a9edee44e3b8ccf1e771e3a68d29439\""
Feb 13 19:52:47.836963 containerd[1536]: time="2025-02-13T19:52:47.836836271Z" level=info msg="CreateContainer within sandbox \"138cb036d3f572cf10cdb92b09f12fffd6bfa2f2c4a1dab6db7bd5cd6e7f08a1\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9a2b6912f8705fbd78cd96079a87af76506ee5b4b8a8919db22ea72555ee5f53\""
Feb 13 19:52:47.838578 containerd[1536]: time="2025-02-13T19:52:47.838558506Z" level=info msg="StartContainer for \"9a2b6912f8705fbd78cd96079a87af76506ee5b4b8a8919db22ea72555ee5f53\""
Feb 13 19:52:47.839408 containerd[1536]: time="2025-02-13T19:52:47.839396686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8d610d6c43052dbc8df47eb68906a982,Namespace:kube-system,Attempt:0,} returns sandbox id \"d5e18cbb4172c2bb5e1319014565296fe69b308e69a723c5c7c7413263eaf3e8\""
Feb 13 19:52:47.840960 containerd[1536]: time="2025-02-13T19:52:47.840948132Z" level=info msg="CreateContainer within sandbox \"d5e18cbb4172c2bb5e1319014565296fe69b308e69a723c5c7c7413263eaf3e8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Feb 13 19:52:47.851366 containerd[1536]: time="2025-02-13T19:52:47.851340456Z" level=info msg="CreateContainer within sandbox \"d5e18cbb4172c2bb5e1319014565296fe69b308e69a723c5c7c7413263eaf3e8\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"93ebf7d230e35fd6ad9861397f415e625f36600d0310cc2430ba1163f0c77d44\""
Feb 13 19:52:47.852990 containerd[1536]: time="2025-02-13T19:52:47.852953256Z" level=info msg="StartContainer for \"93ebf7d230e35fd6ad9861397f415e625f36600d0310cc2430ba1163f0c77d44\""
Feb 13 19:52:47.861080 systemd[1]: Started cri-containerd-9a2b6912f8705fbd78cd96079a87af76506ee5b4b8a8919db22ea72555ee5f53.scope - libcontainer container 9a2b6912f8705fbd78cd96079a87af76506ee5b4b8a8919db22ea72555ee5f53.
Feb 13 19:52:47.864044 systemd[1]: Started cri-containerd-4e0bdd5f1fbde851dde85d25a8e8fd8b7a9edee44e3b8ccf1e771e3a68d29439.scope - libcontainer container 4e0bdd5f1fbde851dde85d25a8e8fd8b7a9edee44e3b8ccf1e771e3a68d29439.
Feb 13 19:52:47.880716 kubelet[2475]: E0213 19:52:47.880685 2475 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.104:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.104:6443: connect: connection refused" interval="1.6s" Feb 13 19:52:47.883193 systemd[1]: Started cri-containerd-93ebf7d230e35fd6ad9861397f415e625f36600d0310cc2430ba1163f0c77d44.scope - libcontainer container 93ebf7d230e35fd6ad9861397f415e625f36600d0310cc2430ba1163f0c77d44. Feb 13 19:52:47.910281 containerd[1536]: time="2025-02-13T19:52:47.910250097Z" level=info msg="StartContainer for \"9a2b6912f8705fbd78cd96079a87af76506ee5b4b8a8919db22ea72555ee5f53\" returns successfully" Feb 13 19:52:47.910383 containerd[1536]: time="2025-02-13T19:52:47.910327025Z" level=info msg="StartContainer for \"4e0bdd5f1fbde851dde85d25a8e8fd8b7a9edee44e3b8ccf1e771e3a68d29439\" returns successfully" Feb 13 19:52:47.931909 containerd[1536]: time="2025-02-13T19:52:47.931879816Z" level=info msg="StartContainer for \"93ebf7d230e35fd6ad9861397f415e625f36600d0310cc2430ba1163f0c77d44\" returns successfully" Feb 13 19:52:47.982290 kubelet[2475]: I0213 19:52:47.982271 2475 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Feb 13 19:52:47.982565 kubelet[2475]: E0213 19:52:47.982551 2475 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.104:6443/api/v1/nodes\": dial tcp 139.178.70.104:6443: connect: connection refused" node="localhost" Feb 13 19:52:49.482892 kubelet[2475]: E0213 19:52:49.482863 2475 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Feb 13 19:52:49.563442 kubelet[2475]: E0213 19:52:49.563416 2475 csi_plugin.go:308] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "localhost" not found 
Feb 13 19:52:49.583898 kubelet[2475]: I0213 19:52:49.583880 2475 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Feb 13 19:52:49.591761 kubelet[2475]: I0213 19:52:49.591745 2475 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Feb 13 19:52:49.597439 kubelet[2475]: E0213 19:52:49.597419 2475 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:52:49.697922 kubelet[2475]: E0213 19:52:49.697900 2475 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:52:49.798493 kubelet[2475]: E0213 19:52:49.798424 2475 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:52:49.898908 kubelet[2475]: E0213 19:52:49.898877 2475 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Feb 13 19:52:50.465663 kubelet[2475]: I0213 19:52:50.465534 2475 apiserver.go:52] "Watching apiserver" Feb 13 19:52:50.476893 kubelet[2475]: I0213 19:52:50.476884 2475 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 19:52:50.917478 systemd[1]: Reloading requested from client PID 2750 ('systemctl') (unit session-9.scope)... Feb 13 19:52:50.917488 systemd[1]: Reloading... Feb 13 19:52:50.980048 zram_generator::config[2797]: No configuration found. Feb 13 19:52:51.037343 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Feb 13 19:52:51.053500 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Feb 13 19:52:51.105769 systemd[1]: Reloading finished in 187 ms. 
Feb 13 19:52:51.132754 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:51.133234 kubelet[2475]: I0213 19:52:51.132873 2475 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 19:52:51.143628 systemd[1]: kubelet.service: Deactivated successfully. Feb 13 19:52:51.143756 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:51.150175 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Feb 13 19:52:51.312326 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Feb 13 19:52:51.315307 (kubelet)[2855]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Feb 13 19:52:51.491550 kubelet[2855]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 13 19:52:51.491789 kubelet[2855]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 13 19:52:51.491826 kubelet[2855]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 13 19:52:51.491920 kubelet[2855]: I0213 19:52:51.491895 2855 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 13 19:52:51.494844 kubelet[2855]: I0213 19:52:51.494832 2855 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Feb 13 19:52:51.494919 kubelet[2855]: I0213 19:52:51.494912 2855 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 13 19:52:51.495093 kubelet[2855]: I0213 19:52:51.495084 2855 server.go:927] "Client rotation is on, will bootstrap in background" Feb 13 19:52:51.496007 kubelet[2855]: I0213 19:52:51.495927 2855 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 13 19:52:51.496730 kubelet[2855]: I0213 19:52:51.496713 2855 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Feb 13 19:52:51.502130 kubelet[2855]: I0213 19:52:51.502118 2855 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Feb 13 19:52:51.502760 kubelet[2855]: I0213 19:52:51.502619 2855 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 13 19:52:51.502760 kubelet[2855]: I0213 19:52:51.502639 2855 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Feb 13 19:52:51.502990 kubelet[2855]: I0213 19:52:51.502917 2855 topology_manager.go:138] "Creating topology manager with none policy" Feb 13 
19:52:51.502990 kubelet[2855]: I0213 19:52:51.502930 2855 container_manager_linux.go:301] "Creating device plugin manager" Feb 13 19:52:51.502990 kubelet[2855]: I0213 19:52:51.502956 2855 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:52:51.503169 kubelet[2855]: I0213 19:52:51.503097 2855 kubelet.go:400] "Attempting to sync node with API server" Feb 13 19:52:51.503169 kubelet[2855]: I0213 19:52:51.503109 2855 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 13 19:52:51.517135 kubelet[2855]: I0213 19:52:51.517116 2855 kubelet.go:312] "Adding apiserver pod source" Feb 13 19:52:51.517135 kubelet[2855]: I0213 19:52:51.517142 2855 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 13 19:52:51.529601 kubelet[2855]: I0213 19:52:51.529396 2855 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Feb 13 19:52:51.529601 kubelet[2855]: I0213 19:52:51.529518 2855 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 13 19:52:51.529900 kubelet[2855]: I0213 19:52:51.529756 2855 server.go:1264] "Started kubelet" Feb 13 19:52:51.539019 kubelet[2855]: I0213 19:52:51.539007 2855 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 13 19:52:51.546554 kubelet[2855]: I0213 19:52:51.546521 2855 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 13 19:52:51.546709 kubelet[2855]: I0213 19:52:51.546696 2855 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 13 19:52:51.546735 kubelet[2855]: I0213 19:52:51.546721 2855 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 13 19:52:51.548068 kubelet[2855]: I0213 19:52:51.547451 2855 server.go:455] "Adding debug handlers to kubelet server" Feb 13 19:52:51.548236 kubelet[2855]: I0213 19:52:51.548227 2855 
volume_manager.go:291] "Starting Kubelet Volume Manager" Feb 13 19:52:51.548278 kubelet[2855]: I0213 19:52:51.548268 2855 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Feb 13 19:52:51.548335 kubelet[2855]: I0213 19:52:51.548328 2855 reconciler.go:26] "Reconciler: start to sync state" Feb 13 19:52:51.549369 kubelet[2855]: I0213 19:52:51.549300 2855 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Feb 13 19:52:51.550491 kubelet[2855]: I0213 19:52:51.550479 2855 factory.go:221] Registration of the containerd container factory successfully Feb 13 19:52:51.550491 kubelet[2855]: I0213 19:52:51.550488 2855 factory.go:221] Registration of the systemd container factory successfully Feb 13 19:52:51.551502 kubelet[2855]: E0213 19:52:51.551270 2855 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Feb 13 19:52:51.572780 kubelet[2855]: I0213 19:52:51.572199 2855 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 13 19:52:51.573455 kubelet[2855]: I0213 19:52:51.573440 2855 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 13 19:52:51.573495 kubelet[2855]: I0213 19:52:51.573456 2855 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 13 19:52:51.573495 kubelet[2855]: I0213 19:52:51.573467 2855 kubelet.go:2337] "Starting kubelet main sync loop" Feb 13 19:52:51.573530 kubelet[2855]: E0213 19:52:51.573493 2855 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 13 19:52:51.590776 kubelet[2855]: I0213 19:52:51.590757 2855 cpu_manager.go:214] "Starting CPU manager" policy="none" Feb 13 19:52:51.590776 kubelet[2855]: I0213 19:52:51.590769 2855 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Feb 13 19:52:51.590877 kubelet[2855]: I0213 19:52:51.590801 2855 state_mem.go:36] "Initialized new in-memory state store" Feb 13 19:52:51.590895 kubelet[2855]: I0213 19:52:51.590885 2855 state_mem.go:88] "Updated default CPUSet" cpuSet="" Feb 13 19:52:51.590913 kubelet[2855]: I0213 19:52:51.590891 2855 state_mem.go:96] "Updated CPUSet assignments" assignments={} Feb 13 19:52:51.590913 kubelet[2855]: I0213 19:52:51.590903 2855 policy_none.go:49] "None policy: Start" Feb 13 19:52:51.591349 kubelet[2855]: I0213 19:52:51.591338 2855 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 13 19:52:51.591380 kubelet[2855]: I0213 19:52:51.591350 2855 state_mem.go:35] "Initializing new in-memory state store" Feb 13 19:52:51.591445 kubelet[2855]: I0213 19:52:51.591434 2855 state_mem.go:75] "Updated machine memory state" Feb 13 19:52:51.593631 kubelet[2855]: I0213 19:52:51.593619 2855 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 13 19:52:51.593828 kubelet[2855]: I0213 19:52:51.593702 2855 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 13 19:52:51.593828 kubelet[2855]: I0213 19:52:51.593750 2855 
plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 13 19:52:51.649666 kubelet[2855]: I0213 19:52:51.649651 2855 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Feb 13 19:52:51.672468 kubelet[2855]: I0213 19:52:51.672331 2855 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Feb 13 19:52:51.672468 kubelet[2855]: I0213 19:52:51.672390 2855 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Feb 13 19:52:51.673526 kubelet[2855]: I0213 19:52:51.673515 2855 topology_manager.go:215] "Topology Admit Handler" podUID="48028343106967711af6e81bafc48212" podNamespace="kube-system" podName="kube-apiserver-localhost" Feb 13 19:52:51.673896 kubelet[2855]: I0213 19:52:51.673603 2855 topology_manager.go:215] "Topology Admit Handler" podUID="dd3721fb1a67092819e35b40473f4063" podNamespace="kube-system" podName="kube-controller-manager-localhost" Feb 13 19:52:51.673896 kubelet[2855]: I0213 19:52:51.673650 2855 topology_manager.go:215] "Topology Admit Handler" podUID="8d610d6c43052dbc8df47eb68906a982" podNamespace="kube-system" podName="kube-scheduler-localhost" Feb 13 19:52:51.849910 kubelet[2855]: I0213 19:52:51.849818 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:52:51.849910 kubelet[2855]: I0213 19:52:51.849865 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:52:51.849910 
kubelet[2855]: I0213 19:52:51.849877 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:52:51.849910 kubelet[2855]: I0213 19:52:51.849894 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:52:51.849910 kubelet[2855]: I0213 19:52:51.849904 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8d610d6c43052dbc8df47eb68906a982-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8d610d6c43052dbc8df47eb68906a982\") " pod="kube-system/kube-scheduler-localhost" Feb 13 19:52:51.850094 kubelet[2855]: I0213 19:52:51.849912 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/48028343106967711af6e81bafc48212-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"48028343106967711af6e81bafc48212\") " pod="kube-system/kube-apiserver-localhost" Feb 13 19:52:51.850094 kubelet[2855]: I0213 19:52:51.849922 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:52:51.850094 kubelet[2855]: I0213 19:52:51.849930 2855 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:52:51.850094 kubelet[2855]: I0213 19:52:51.849939 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dd3721fb1a67092819e35b40473f4063-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"dd3721fb1a67092819e35b40473f4063\") " pod="kube-system/kube-controller-manager-localhost" Feb 13 19:52:52.525657 kubelet[2855]: I0213 19:52:52.525493 2855 apiserver.go:52] "Watching apiserver" Feb 13 19:52:52.622275 kubelet[2855]: I0213 19:52:52.622160 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.62205904 podStartE2EDuration="1.62205904s" podCreationTimestamp="2025-02-13 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:52:52.607055076 +0000 UTC m=+1.169689471" watchObservedRunningTime="2025-02-13 19:52:52.62205904 +0000 UTC m=+1.184693435" Feb 13 19:52:52.640182 kubelet[2855]: I0213 19:52:52.640045 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.640029657 podStartE2EDuration="1.640029657s" podCreationTimestamp="2025-02-13 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:52:52.622267022 +0000 UTC m=+1.184901419" watchObservedRunningTime="2025-02-13 19:52:52.640029657 +0000 UTC m=+1.202664059" Feb 13 
19:52:52.649303 kubelet[2855]: I0213 19:52:52.649263 2855 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Feb 13 19:52:55.624167 sudo[1849]: pam_unix(sudo:session): session closed for user root Feb 13 19:52:55.625229 sshd[1848]: Connection closed by 147.75.109.163 port 45998 Feb 13 19:52:55.633785 sshd-session[1846]: pam_unix(sshd:session): session closed for user core Feb 13 19:52:55.636449 systemd[1]: sshd@6-139.178.70.104:22-147.75.109.163:45998.service: Deactivated successfully. Feb 13 19:52:55.637666 systemd[1]: session-9.scope: Deactivated successfully. Feb 13 19:52:55.637856 systemd[1]: session-9.scope: Consumed 3.069s CPU time, 184.1M memory peak, 0B memory swap peak. Feb 13 19:52:55.638247 systemd-logind[1517]: Session 9 logged out. Waiting for processes to exit. Feb 13 19:52:55.638819 systemd-logind[1517]: Removed session 9. Feb 13 19:52:57.921327 kubelet[2855]: I0213 19:52:57.921165 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=6.921154823 podStartE2EDuration="6.921154823s" podCreationTimestamp="2025-02-13 19:52:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:52:52.640606142 +0000 UTC m=+1.203240556" watchObservedRunningTime="2025-02-13 19:52:57.921154823 +0000 UTC m=+6.483789221" Feb 13 19:53:06.833942 kubelet[2855]: I0213 19:53:06.833886 2855 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Feb 13 19:53:06.834454 containerd[1536]: time="2025-02-13T19:53:06.834344017Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Feb 13 19:53:06.834644 kubelet[2855]: I0213 19:53:06.834457 2855 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Feb 13 19:53:07.681821 kubelet[2855]: I0213 19:53:07.681786 2855 topology_manager.go:215] "Topology Admit Handler" podUID="ad72c888-7d62-4ca1-ba30-8f03a3c50b5b" podNamespace="kube-system" podName="kube-proxy-jsmbd" Feb 13 19:53:07.688499 systemd[1]: Created slice kubepods-besteffort-podad72c888_7d62_4ca1_ba30_8f03a3c50b5b.slice - libcontainer container kubepods-besteffort-podad72c888_7d62_4ca1_ba30_8f03a3c50b5b.slice. Feb 13 19:53:07.766673 kubelet[2855]: I0213 19:53:07.766558 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ad72c888-7d62-4ca1-ba30-8f03a3c50b5b-kube-proxy\") pod \"kube-proxy-jsmbd\" (UID: \"ad72c888-7d62-4ca1-ba30-8f03a3c50b5b\") " pod="kube-system/kube-proxy-jsmbd" Feb 13 19:53:07.766673 kubelet[2855]: I0213 19:53:07.766605 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbkzc\" (UniqueName: \"kubernetes.io/projected/ad72c888-7d62-4ca1-ba30-8f03a3c50b5b-kube-api-access-qbkzc\") pod \"kube-proxy-jsmbd\" (UID: \"ad72c888-7d62-4ca1-ba30-8f03a3c50b5b\") " pod="kube-system/kube-proxy-jsmbd" Feb 13 19:53:07.766673 kubelet[2855]: I0213 19:53:07.766623 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ad72c888-7d62-4ca1-ba30-8f03a3c50b5b-xtables-lock\") pod \"kube-proxy-jsmbd\" (UID: \"ad72c888-7d62-4ca1-ba30-8f03a3c50b5b\") " pod="kube-system/kube-proxy-jsmbd" Feb 13 19:53:07.766673 kubelet[2855]: I0213 19:53:07.766634 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ad72c888-7d62-4ca1-ba30-8f03a3c50b5b-lib-modules\") pod 
\"kube-proxy-jsmbd\" (UID: \"ad72c888-7d62-4ca1-ba30-8f03a3c50b5b\") " pod="kube-system/kube-proxy-jsmbd" Feb 13 19:53:07.844546 kubelet[2855]: I0213 19:53:07.844134 2855 topology_manager.go:215] "Topology Admit Handler" podUID="b46daded-3e76-4a4f-8f99-07e7ec22a583" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-dzt8j" Feb 13 19:53:07.850517 systemd[1]: Created slice kubepods-besteffort-podb46daded_3e76_4a4f_8f99_07e7ec22a583.slice - libcontainer container kubepods-besteffort-podb46daded_3e76_4a4f_8f99_07e7ec22a583.slice. Feb 13 19:53:07.867527 kubelet[2855]: I0213 19:53:07.867320 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b46daded-3e76-4a4f-8f99-07e7ec22a583-var-lib-calico\") pod \"tigera-operator-7bc55997bb-dzt8j\" (UID: \"b46daded-3e76-4a4f-8f99-07e7ec22a583\") " pod="tigera-operator/tigera-operator-7bc55997bb-dzt8j" Feb 13 19:53:07.867527 kubelet[2855]: I0213 19:53:07.867359 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vppcg\" (UniqueName: \"kubernetes.io/projected/b46daded-3e76-4a4f-8f99-07e7ec22a583-kube-api-access-vppcg\") pod \"tigera-operator-7bc55997bb-dzt8j\" (UID: \"b46daded-3e76-4a4f-8f99-07e7ec22a583\") " pod="tigera-operator/tigera-operator-7bc55997bb-dzt8j" Feb 13 19:53:07.996131 containerd[1536]: time="2025-02-13T19:53:07.996068663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jsmbd,Uid:ad72c888-7d62-4ca1-ba30-8f03a3c50b5b,Namespace:kube-system,Attempt:0,}" Feb 13 19:53:08.011150 containerd[1536]: time="2025-02-13T19:53:08.011039139Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:08.011150 containerd[1536]: time="2025-02-13T19:53:08.011082475Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:08.011150 containerd[1536]: time="2025-02-13T19:53:08.011092256Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:08.011373 containerd[1536]: time="2025-02-13T19:53:08.011210070Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:08.027093 systemd[1]: Started cri-containerd-736dca951b514663896ac74b08afc1a347ae37ac83a0d90dca1b8d8ae5762a71.scope - libcontainer container 736dca951b514663896ac74b08afc1a347ae37ac83a0d90dca1b8d8ae5762a71. Feb 13 19:53:08.040510 containerd[1536]: time="2025-02-13T19:53:08.040478595Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-jsmbd,Uid:ad72c888-7d62-4ca1-ba30-8f03a3c50b5b,Namespace:kube-system,Attempt:0,} returns sandbox id \"736dca951b514663896ac74b08afc1a347ae37ac83a0d90dca1b8d8ae5762a71\"" Feb 13 19:53:08.042574 containerd[1536]: time="2025-02-13T19:53:08.042558361Z" level=info msg="CreateContainer within sandbox \"736dca951b514663896ac74b08afc1a347ae37ac83a0d90dca1b8d8ae5762a71\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Feb 13 19:53:08.052754 containerd[1536]: time="2025-02-13T19:53:08.052683203Z" level=info msg="CreateContainer within sandbox \"736dca951b514663896ac74b08afc1a347ae37ac83a0d90dca1b8d8ae5762a71\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"22e68afbca55edeb17b1d6d1e4e4eab7fd58749b8f4a3d3afa7887b08d0ebb1b\"" Feb 13 19:53:08.053871 containerd[1536]: time="2025-02-13T19:53:08.053132957Z" level=info msg="StartContainer for \"22e68afbca55edeb17b1d6d1e4e4eab7fd58749b8f4a3d3afa7887b08d0ebb1b\"" Feb 13 19:53:08.071086 systemd[1]: Started cri-containerd-22e68afbca55edeb17b1d6d1e4e4eab7fd58749b8f4a3d3afa7887b08d0ebb1b.scope - libcontainer container 
22e68afbca55edeb17b1d6d1e4e4eab7fd58749b8f4a3d3afa7887b08d0ebb1b. Feb 13 19:53:08.087780 containerd[1536]: time="2025-02-13T19:53:08.087749515Z" level=info msg="StartContainer for \"22e68afbca55edeb17b1d6d1e4e4eab7fd58749b8f4a3d3afa7887b08d0ebb1b\" returns successfully" Feb 13 19:53:08.154730 containerd[1536]: time="2025-02-13T19:53:08.154698157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-dzt8j,Uid:b46daded-3e76-4a4f-8f99-07e7ec22a583,Namespace:tigera-operator,Attempt:0,}" Feb 13 19:53:08.175882 containerd[1536]: time="2025-02-13T19:53:08.175830919Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:08.176450 containerd[1536]: time="2025-02-13T19:53:08.176160079Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:08.176641 containerd[1536]: time="2025-02-13T19:53:08.176537836Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:08.176641 containerd[1536]: time="2025-02-13T19:53:08.176591326Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:08.191122 systemd[1]: Started cri-containerd-ebaf176b4126dd0419ebc92c23a1e3d7454a728c51fcc6c22774d72688334683.scope - libcontainer container ebaf176b4126dd0419ebc92c23a1e3d7454a728c51fcc6c22774d72688334683. 
Feb 13 19:53:08.217906 containerd[1536]: time="2025-02-13T19:53:08.217835051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-dzt8j,Uid:b46daded-3e76-4a4f-8f99-07e7ec22a583,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"ebaf176b4126dd0419ebc92c23a1e3d7454a728c51fcc6c22774d72688334683\"" Feb 13 19:53:08.219833 containerd[1536]: time="2025-02-13T19:53:08.219811028Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Feb 13 19:53:08.609977 kubelet[2855]: I0213 19:53:08.609750 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-jsmbd" podStartSLOduration=1.6097381579999999 podStartE2EDuration="1.609738158s" podCreationTimestamp="2025-02-13 19:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:53:08.609580727 +0000 UTC m=+17.172215143" watchObservedRunningTime="2025-02-13 19:53:08.609738158 +0000 UTC m=+17.172372560" Feb 13 19:53:09.656054 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount117234383.mount: Deactivated successfully. 
Feb 13 19:53:10.014231 containerd[1536]: time="2025-02-13T19:53:10.014125651Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:10.014231 containerd[1536]: time="2025-02-13T19:53:10.014162206Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21762497" Feb 13 19:53:10.015025 containerd[1536]: time="2025-02-13T19:53:10.014834366Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:10.022187 containerd[1536]: time="2025-02-13T19:53:10.022160412Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:10.022877 containerd[1536]: time="2025-02-13T19:53:10.022588380Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 1.802632089s" Feb 13 19:53:10.022877 containerd[1536]: time="2025-02-13T19:53:10.022605204Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Feb 13 19:53:10.025523 containerd[1536]: time="2025-02-13T19:53:10.025504960Z" level=info msg="CreateContainer within sandbox \"ebaf176b4126dd0419ebc92c23a1e3d7454a728c51fcc6c22774d72688334683\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Feb 13 19:53:10.043939 containerd[1536]: time="2025-02-13T19:53:10.043915252Z" level=info msg="CreateContainer within sandbox 
\"ebaf176b4126dd0419ebc92c23a1e3d7454a728c51fcc6c22774d72688334683\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee\"" Feb 13 19:53:10.045030 containerd[1536]: time="2025-02-13T19:53:10.044961024Z" level=info msg="StartContainer for \"1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee\"" Feb 13 19:53:10.062034 systemd[1]: run-containerd-runc-k8s.io-1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee-runc.knncNU.mount: Deactivated successfully. Feb 13 19:53:10.067061 systemd[1]: Started cri-containerd-1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee.scope - libcontainer container 1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee. Feb 13 19:53:10.081580 containerd[1536]: time="2025-02-13T19:53:10.081553930Z" level=info msg="StartContainer for \"1e4207bd0dc86f3dc789a1a00cbd64f50fffb17c425a20dd6b70016d18ed7eee\" returns successfully" Feb 13 19:53:10.649845 kubelet[2855]: I0213 19:53:10.648646 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-dzt8j" podStartSLOduration=1.84410498 podStartE2EDuration="3.648635239s" podCreationTimestamp="2025-02-13 19:53:07 +0000 UTC" firstStartedPulling="2025-02-13 19:53:08.218494843 +0000 UTC m=+16.781129237" lastFinishedPulling="2025-02-13 19:53:10.023025102 +0000 UTC m=+18.585659496" observedRunningTime="2025-02-13 19:53:10.647957764 +0000 UTC m=+19.210592178" watchObservedRunningTime="2025-02-13 19:53:10.648635239 +0000 UTC m=+19.211269641" Feb 13 19:53:12.802910 kubelet[2855]: I0213 19:53:12.802883 2855 topology_manager.go:215] "Topology Admit Handler" podUID="84707b19-8011-4451-82e1-8cef7db3c9f4" podNamespace="calico-system" podName="calico-typha-98c8984b4-vgr28" Feb 13 19:53:12.810398 systemd[1]: Created slice kubepods-besteffort-pod84707b19_8011_4451_82e1_8cef7db3c9f4.slice - libcontainer container 
kubepods-besteffort-pod84707b19_8011_4451_82e1_8cef7db3c9f4.slice. Feb 13 19:53:12.813186 kubelet[2855]: W0213 19:53:12.811968 2855 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.813186 kubelet[2855]: E0213 19:53:12.813061 2855 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:localhost" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.813186 kubelet[2855]: W0213 19:53:12.812113 2855 reflector.go:547] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.813186 kubelet[2855]: E0213 19:53:12.813077 2855 reflector.go:150] object-"calico-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.813186 kubelet[2855]: W0213 19:53:12.812780 2855 reflector.go:547] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.813977 
kubelet[2855]: E0213 19:53:12.813091 2855 reflector.go:150] object-"calico-system"/"tigera-ca-bundle": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:localhost" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'localhost' and this object Feb 13 19:53:12.896852 kubelet[2855]: I0213 19:53:12.896774 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84707b19-8011-4451-82e1-8cef7db3c9f4-tigera-ca-bundle\") pod \"calico-typha-98c8984b4-vgr28\" (UID: \"84707b19-8011-4451-82e1-8cef7db3c9f4\") " pod="calico-system/calico-typha-98c8984b4-vgr28" Feb 13 19:53:12.896852 kubelet[2855]: I0213 19:53:12.896805 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/84707b19-8011-4451-82e1-8cef7db3c9f4-typha-certs\") pod \"calico-typha-98c8984b4-vgr28\" (UID: \"84707b19-8011-4451-82e1-8cef7db3c9f4\") " pod="calico-system/calico-typha-98c8984b4-vgr28" Feb 13 19:53:12.896852 kubelet[2855]: I0213 19:53:12.896816 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wx4t\" (UniqueName: \"kubernetes.io/projected/84707b19-8011-4451-82e1-8cef7db3c9f4-kube-api-access-7wx4t\") pod \"calico-typha-98c8984b4-vgr28\" (UID: \"84707b19-8011-4451-82e1-8cef7db3c9f4\") " pod="calico-system/calico-typha-98c8984b4-vgr28" Feb 13 19:53:12.917051 kubelet[2855]: I0213 19:53:12.916088 2855 topology_manager.go:215] "Topology Admit Handler" podUID="dfcc6f11-1602-40b0-8c53-2959bb6a9537" podNamespace="calico-system" podName="calico-node-mhzzt" Feb 13 19:53:12.923754 systemd[1]: Created slice kubepods-besteffort-poddfcc6f11_1602_40b0_8c53_2959bb6a9537.slice - libcontainer container 
kubepods-besteffort-poddfcc6f11_1602_40b0_8c53_2959bb6a9537.slice. Feb 13 19:53:12.997522 kubelet[2855]: I0213 19:53:12.997482 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-var-run-calico\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.997885 kubelet[2855]: I0213 19:53:12.997675 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-flexvol-driver-host\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.997885 kubelet[2855]: I0213 19:53:12.997693 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-lib-modules\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.997885 kubelet[2855]: I0213 19:53:12.997729 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dfcc6f11-1602-40b0-8c53-2959bb6a9537-tigera-ca-bundle\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.997885 kubelet[2855]: I0213 19:53:12.997742 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-cni-log-dir\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " 
pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.997885 kubelet[2855]: I0213 19:53:12.997761 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzpd\" (UniqueName: \"kubernetes.io/projected/dfcc6f11-1602-40b0-8c53-2959bb6a9537-kube-api-access-nqzpd\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.998442 kubelet[2855]: I0213 19:53:12.998272 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-xtables-lock\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.998442 kubelet[2855]: I0213 19:53:12.998313 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-cni-net-dir\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.998442 kubelet[2855]: I0213 19:53:12.998336 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-policysync\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.998442 kubelet[2855]: I0213 19:53:12.998354 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/dfcc6f11-1602-40b0-8c53-2959bb6a9537-node-certs\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.998442 kubelet[2855]: 
I0213 19:53:12.998426 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-var-lib-calico\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:12.999353 kubelet[2855]: I0213 19:53:12.999310 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/dfcc6f11-1602-40b0-8c53-2959bb6a9537-cni-bin-dir\") pod \"calico-node-mhzzt\" (UID: \"dfcc6f11-1602-40b0-8c53-2959bb6a9537\") " pod="calico-system/calico-node-mhzzt" Feb 13 19:53:13.027750 kubelet[2855]: I0213 19:53:13.027374 2855 topology_manager.go:215] "Topology Admit Handler" podUID="34af207b-f303-480e-a684-e96e850daafd" podNamespace="calico-system" podName="csi-node-driver-kzqv2" Feb 13 19:53:13.027750 kubelet[2855]: E0213 19:53:13.027609 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:13.099522 kubelet[2855]: I0213 19:53:13.099457 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34af207b-f303-480e-a684-e96e850daafd-kubelet-dir\") pod \"csi-node-driver-kzqv2\" (UID: \"34af207b-f303-480e-a684-e96e850daafd\") " pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:13.099691 kubelet[2855]: I0213 19:53:13.099679 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p58p\" (UniqueName: 
\"kubernetes.io/projected/34af207b-f303-480e-a684-e96e850daafd-kube-api-access-6p58p\") pod \"csi-node-driver-kzqv2\" (UID: \"34af207b-f303-480e-a684-e96e850daafd\") " pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:13.100560 kubelet[2855]: I0213 19:53:13.099992 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/34af207b-f303-480e-a684-e96e850daafd-varrun\") pod \"csi-node-driver-kzqv2\" (UID: \"34af207b-f303-480e-a684-e96e850daafd\") " pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:13.100560 kubelet[2855]: I0213 19:53:13.100011 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/34af207b-f303-480e-a684-e96e850daafd-registration-dir\") pod \"csi-node-driver-kzqv2\" (UID: \"34af207b-f303-480e-a684-e96e850daafd\") " pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:13.100560 kubelet[2855]: I0213 19:53:13.100036 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/34af207b-f303-480e-a684-e96e850daafd-socket-dir\") pod \"csi-node-driver-kzqv2\" (UID: \"34af207b-f303-480e-a684-e96e850daafd\") " pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:13.105798 kubelet[2855]: E0213 19:53:13.105765 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.105798 kubelet[2855]: W0213 19:53:13.105784 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.106036 kubelet[2855]: E0213 19:53:13.105803 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.201723 kubelet[2855]: E0213 19:53:13.201713 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.201723 kubelet[2855]: W0213 19:53:13.201721 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.201723 kubelet[2855]: E0213 19:53:13.201725 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.201821 kubelet[2855]: E0213 19:53:13.201805 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.201821 kubelet[2855]: W0213 19:53:13.201810 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.201821 kubelet[2855]: E0213 19:53:13.201821 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.201929 kubelet[2855]: E0213 19:53:13.201919 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.201929 kubelet[2855]: W0213 19:53:13.201926 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.201972 kubelet[2855]: E0213 19:53:13.201933 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.202038 kubelet[2855]: E0213 19:53:13.202027 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202078 kubelet[2855]: W0213 19:53:13.202039 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202078 kubelet[2855]: E0213 19:53:13.202046 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.202139 kubelet[2855]: E0213 19:53:13.202125 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202139 kubelet[2855]: W0213 19:53:13.202129 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202139 kubelet[2855]: E0213 19:53:13.202139 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.202237 kubelet[2855]: E0213 19:53:13.202227 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202237 kubelet[2855]: W0213 19:53:13.202234 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202282 kubelet[2855]: E0213 19:53:13.202241 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.202333 kubelet[2855]: E0213 19:53:13.202323 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202333 kubelet[2855]: W0213 19:53:13.202330 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202378 kubelet[2855]: E0213 19:53:13.202341 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.202441 kubelet[2855]: E0213 19:53:13.202432 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202441 kubelet[2855]: W0213 19:53:13.202439 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202528 kubelet[2855]: E0213 19:53:13.202518 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.202563 kubelet[2855]: E0213 19:53:13.202554 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.202563 kubelet[2855]: W0213 19:53:13.202560 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.202635 kubelet[2855]: E0213 19:53:13.202610 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.202635 kubelet[2855]: E0213 19:53:13.202633 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208064 kubelet[2855]: W0213 19:53:13.202637 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202711 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202733 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208064 kubelet[2855]: W0213 19:53:13.202736 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202762 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202813 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208064 kubelet[2855]: W0213 19:53:13.202817 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202828 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208064 kubelet[2855]: E0213 19:53:13.202934 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208064 kubelet[2855]: W0213 19:53:13.202939 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.202947 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203088 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208321 kubelet[2855]: W0213 19:53:13.203094 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203104 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203219 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208321 kubelet[2855]: W0213 19:53:13.203224 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203233 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203332 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208321 kubelet[2855]: W0213 19:53:13.203336 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208321 kubelet[2855]: E0213 19:53:13.203345 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203434 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208488 kubelet[2855]: W0213 19:53:13.203439 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203447 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203537 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208488 kubelet[2855]: W0213 19:53:13.203541 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203548 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203657 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208488 kubelet[2855]: W0213 19:53:13.203668 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203674 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208488 kubelet[2855]: E0213 19:53:13.203765 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208644 kubelet[2855]: W0213 19:53:13.203773 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.203782 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.203883 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208644 kubelet[2855]: W0213 19:53:13.203888 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.203895 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.204038 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208644 kubelet[2855]: W0213 19:53:13.204049 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.204056 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.208644 kubelet[2855]: E0213 19:53:13.204156 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208644 kubelet[2855]: W0213 19:53:13.204164 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208800 kubelet[2855]: E0213 19:53:13.204170 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.208800 kubelet[2855]: E0213 19:53:13.204399 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.208800 kubelet[2855]: W0213 19:53:13.204405 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.208800 kubelet[2855]: E0213 19:53:13.204411 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.209085 kubelet[2855]: E0213 19:53:13.208974 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.209085 kubelet[2855]: W0213 19:53:13.209020 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.209085 kubelet[2855]: E0213 19:53:13.209033 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.302755 kubelet[2855]: E0213 19:53:13.302655 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.302755 kubelet[2855]: W0213 19:53:13.302674 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.302755 kubelet[2855]: E0213 19:53:13.302690 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.302926 kubelet[2855]: E0213 19:53:13.302840 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.302926 kubelet[2855]: W0213 19:53:13.302846 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.302926 kubelet[2855]: E0213 19:53:13.302854 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.303062 kubelet[2855]: E0213 19:53:13.302972 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.303062 kubelet[2855]: W0213 19:53:13.302977 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.303062 kubelet[2855]: E0213 19:53:13.303007 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.303159 kubelet[2855]: E0213 19:53:13.303136 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.303159 kubelet[2855]: W0213 19:53:13.303142 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.303159 kubelet[2855]: E0213 19:53:13.303148 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.303273 kubelet[2855]: E0213 19:53:13.303261 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.303273 kubelet[2855]: W0213 19:53:13.303269 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.303331 kubelet[2855]: E0213 19:53:13.303275 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.303390 kubelet[2855]: E0213 19:53:13.303383 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.303390 kubelet[2855]: W0213 19:53:13.303390 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.303443 kubelet[2855]: E0213 19:53:13.303396 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.404123 kubelet[2855]: E0213 19:53:13.404025 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.404123 kubelet[2855]: W0213 19:53:13.404043 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.404123 kubelet[2855]: E0213 19:53:13.404058 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.404726 kubelet[2855]: E0213 19:53:13.404568 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.404726 kubelet[2855]: W0213 19:53:13.404578 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.404726 kubelet[2855]: E0213 19:53:13.404587 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.404926 kubelet[2855]: E0213 19:53:13.404847 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.404926 kubelet[2855]: W0213 19:53:13.404855 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.404926 kubelet[2855]: E0213 19:53:13.404862 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.405064 kubelet[2855]: E0213 19:53:13.405056 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.405175 kubelet[2855]: W0213 19:53:13.405105 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.405175 kubelet[2855]: E0213 19:53:13.405115 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.405270 kubelet[2855]: E0213 19:53:13.405263 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.405315 kubelet[2855]: W0213 19:53:13.405307 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.405358 kubelet[2855]: E0213 19:53:13.405351 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.405552 kubelet[2855]: E0213 19:53:13.405515 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.405552 kubelet[2855]: W0213 19:53:13.405523 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.405552 kubelet[2855]: E0213 19:53:13.405529 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.506365 kubelet[2855]: E0213 19:53:13.506275 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.506365 kubelet[2855]: W0213 19:53:13.506292 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.506365 kubelet[2855]: E0213 19:53:13.506307 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.506560 kubelet[2855]: E0213 19:53:13.506452 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.506560 kubelet[2855]: W0213 19:53:13.506459 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.506560 kubelet[2855]: E0213 19:53:13.506467 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.506683 kubelet[2855]: E0213 19:53:13.506590 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.506683 kubelet[2855]: W0213 19:53:13.506596 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.506683 kubelet[2855]: E0213 19:53:13.506602 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.506781 kubelet[2855]: E0213 19:53:13.506712 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.506781 kubelet[2855]: W0213 19:53:13.506718 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.506781 kubelet[2855]: E0213 19:53:13.506726 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.506881 kubelet[2855]: E0213 19:53:13.506832 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.506881 kubelet[2855]: W0213 19:53:13.506837 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.506881 kubelet[2855]: E0213 19:53:13.506844 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.507043 kubelet[2855]: E0213 19:53:13.506969 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.507043 kubelet[2855]: W0213 19:53:13.506974 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.507043 kubelet[2855]: E0213 19:53:13.506990 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.608329 kubelet[2855]: E0213 19:53:13.608309 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608329 kubelet[2855]: W0213 19:53:13.608323 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608329 kubelet[2855]: E0213 19:53:13.608335 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.608457 kubelet[2855]: E0213 19:53:13.608432 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608457 kubelet[2855]: W0213 19:53:13.608437 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608457 kubelet[2855]: E0213 19:53:13.608442 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.608535 kubelet[2855]: E0213 19:53:13.608531 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608535 kubelet[2855]: W0213 19:53:13.608535 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608574 kubelet[2855]: E0213 19:53:13.608540 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.608640 kubelet[2855]: E0213 19:53:13.608630 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608640 kubelet[2855]: W0213 19:53:13.608638 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608683 kubelet[2855]: E0213 19:53:13.608643 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.608759 kubelet[2855]: E0213 19:53:13.608749 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608759 kubelet[2855]: W0213 19:53:13.608757 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608801 kubelet[2855]: E0213 19:53:13.608763 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.608855 kubelet[2855]: E0213 19:53:13.608846 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.608855 kubelet[2855]: W0213 19:53:13.608853 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.608895 kubelet[2855]: E0213 19:53:13.608858 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.666447 kubelet[2855]: E0213 19:53:13.666342 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.666447 kubelet[2855]: W0213 19:53:13.666359 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.666447 kubelet[2855]: E0213 19:53:13.666373 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.709318 kubelet[2855]: E0213 19:53:13.709194 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.709318 kubelet[2855]: W0213 19:53:13.709206 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.709318 kubelet[2855]: E0213 19:53:13.709218 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.709318 kubelet[2855]: E0213 19:53:13.709317 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.709318 kubelet[2855]: W0213 19:53:13.709322 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.709518 kubelet[2855]: E0213 19:53:13.709327 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.709518 kubelet[2855]: E0213 19:53:13.709408 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.709518 kubelet[2855]: W0213 19:53:13.709414 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.709518 kubelet[2855]: E0213 19:53:13.709419 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.709518 kubelet[2855]: E0213 19:53:13.709500 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.709518 kubelet[2855]: W0213 19:53:13.709505 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.709518 kubelet[2855]: E0213 19:53:13.709509 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.709666 kubelet[2855]: E0213 19:53:13.709593 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.709666 kubelet[2855]: W0213 19:53:13.709597 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.709666 kubelet[2855]: E0213 19:53:13.709602 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.810576 kubelet[2855]: E0213 19:53:13.810553 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.810576 kubelet[2855]: W0213 19:53:13.810569 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.810880 kubelet[2855]: E0213 19:53:13.810584 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.810880 kubelet[2855]: E0213 19:53:13.810727 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.810880 kubelet[2855]: W0213 19:53:13.810733 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.810880 kubelet[2855]: E0213 19:53:13.810738 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.810880 kubelet[2855]: E0213 19:53:13.810828 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.810880 kubelet[2855]: W0213 19:53:13.810832 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.810880 kubelet[2855]: E0213 19:53:13.810837 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.811075 kubelet[2855]: E0213 19:53:13.810917 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.811075 kubelet[2855]: W0213 19:53:13.810923 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.811075 kubelet[2855]: E0213 19:53:13.810928 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.811075 kubelet[2855]: E0213 19:53:13.811025 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.811075 kubelet[2855]: W0213 19:53:13.811029 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.811075 kubelet[2855]: E0213 19:53:13.811034 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.911926 kubelet[2855]: E0213 19:53:13.911839 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.911926 kubelet[2855]: W0213 19:53:13.911855 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.911926 kubelet[2855]: E0213 19:53:13.911869 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.912167 kubelet[2855]: E0213 19:53:13.912003 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.912167 kubelet[2855]: W0213 19:53:13.912009 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.912167 kubelet[2855]: E0213 19:53:13.912014 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.912167 kubelet[2855]: E0213 19:53:13.912105 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.912167 kubelet[2855]: W0213 19:53:13.912110 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.912167 kubelet[2855]: E0213 19:53:13.912115 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.912323 kubelet[2855]: E0213 19:53:13.912213 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.912323 kubelet[2855]: W0213 19:53:13.912218 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.912323 kubelet[2855]: E0213 19:53:13.912223 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.912323 kubelet[2855]: E0213 19:53:13.912308 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.912323 kubelet[2855]: W0213 19:53:13.912312 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.912323 kubelet[2855]: E0213 19:53:13.912317 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.965684 kubelet[2855]: E0213 19:53:13.965348 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.965684 kubelet[2855]: W0213 19:53:13.965362 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.965684 kubelet[2855]: E0213 19:53:13.965377 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.968995 kubelet[2855]: E0213 19:53:13.966732 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.969066 kubelet[2855]: W0213 19:53:13.969056 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.969118 kubelet[2855]: E0213 19:53:13.969110 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:13.971145 kubelet[2855]: E0213 19:53:13.971129 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:13.971145 kubelet[2855]: W0213 19:53:13.971141 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:13.971215 kubelet[2855]: E0213 19:53:13.971151 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:13.997908 kubelet[2855]: E0213 19:53:13.997847 2855 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 13 19:53:13.998017 kubelet[2855]: E0213 19:53:13.997933 2855 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/84707b19-8011-4451-82e1-8cef7db3c9f4-tigera-ca-bundle podName:84707b19-8011-4451-82e1-8cef7db3c9f4 nodeName:}" failed. No retries permitted until 2025-02-13 19:53:14.497901452 +0000 UTC m=+23.060535850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/84707b19-8011-4451-82e1-8cef7db3c9f4-tigera-ca-bundle") pod "calico-typha-98c8984b4-vgr28" (UID: "84707b19-8011-4451-82e1-8cef7db3c9f4") : failed to sync configmap cache: timed out waiting for the condition Feb 13 19:53:14.013139 kubelet[2855]: E0213 19:53:14.013118 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.013139 kubelet[2855]: W0213 19:53:14.013134 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.013268 kubelet[2855]: E0213 19:53:14.013150 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.013299 kubelet[2855]: E0213 19:53:14.013292 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.013324 kubelet[2855]: W0213 19:53:14.013298 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.013324 kubelet[2855]: E0213 19:53:14.013305 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.100061 kubelet[2855]: E0213 19:53:14.100031 2855 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Feb 13 19:53:14.100173 kubelet[2855]: E0213 19:53:14.100105 2855 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfcc6f11-1602-40b0-8c53-2959bb6a9537-tigera-ca-bundle podName:dfcc6f11-1602-40b0-8c53-2959bb6a9537 nodeName:}" failed. No retries permitted until 2025-02-13 19:53:14.60008949 +0000 UTC m=+23.162723888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/dfcc6f11-1602-40b0-8c53-2959bb6a9537-tigera-ca-bundle") pod "calico-node-mhzzt" (UID: "dfcc6f11-1602-40b0-8c53-2959bb6a9537") : failed to sync configmap cache: timed out waiting for the condition Feb 13 19:53:14.114679 kubelet[2855]: E0213 19:53:14.114561 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.114679 kubelet[2855]: W0213 19:53:14.114578 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.114679 kubelet[2855]: E0213 19:53:14.114593 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.114866 kubelet[2855]: E0213 19:53:14.114760 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.114866 kubelet[2855]: W0213 19:53:14.114767 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.114866 kubelet[2855]: E0213 19:53:14.114774 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.216124 kubelet[2855]: E0213 19:53:14.216035 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.216124 kubelet[2855]: W0213 19:53:14.216053 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.216361 kubelet[2855]: E0213 19:53:14.216066 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.216467 kubelet[2855]: E0213 19:53:14.216418 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.216467 kubelet[2855]: W0213 19:53:14.216423 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.216467 kubelet[2855]: E0213 19:53:14.216435 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.317817 kubelet[2855]: E0213 19:53:14.317783 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.317817 kubelet[2855]: W0213 19:53:14.317800 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.317817 kubelet[2855]: E0213 19:53:14.317816 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.318077 kubelet[2855]: E0213 19:53:14.317938 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.318077 kubelet[2855]: W0213 19:53:14.317943 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.318077 kubelet[2855]: E0213 19:53:14.317949 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.419198 kubelet[2855]: E0213 19:53:14.419139 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.419198 kubelet[2855]: W0213 19:53:14.419158 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.419198 kubelet[2855]: E0213 19:53:14.419173 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.419375 kubelet[2855]: E0213 19:53:14.419329 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.419375 kubelet[2855]: W0213 19:53:14.419336 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.419375 kubelet[2855]: E0213 19:53:14.419345 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.520583 kubelet[2855]: E0213 19:53:14.520466 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.520583 kubelet[2855]: W0213 19:53:14.520485 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.520583 kubelet[2855]: E0213 19:53:14.520500 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.520827 kubelet[2855]: E0213 19:53:14.520681 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.520827 kubelet[2855]: W0213 19:53:14.520688 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.520827 kubelet[2855]: E0213 19:53:14.520703 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.521406 kubelet[2855]: E0213 19:53:14.521326 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.521406 kubelet[2855]: W0213 19:53:14.521367 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.521406 kubelet[2855]: E0213 19:53:14.521376 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.521620 kubelet[2855]: E0213 19:53:14.521566 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.521620 kubelet[2855]: W0213 19:53:14.521572 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.521620 kubelet[2855]: E0213 19:53:14.521578 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.521787 kubelet[2855]: E0213 19:53:14.521721 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.521787 kubelet[2855]: W0213 19:53:14.521726 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.521787 kubelet[2855]: E0213 19:53:14.521731 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.521882 kubelet[2855]: E0213 19:53:14.521877 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.522297 kubelet[2855]: W0213 19:53:14.521913 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.522297 kubelet[2855]: E0213 19:53:14.521920 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.522531 kubelet[2855]: E0213 19:53:14.522524 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.522570 kubelet[2855]: W0213 19:53:14.522564 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.522604 kubelet[2855]: E0213 19:53:14.522597 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.574197 kubelet[2855]: E0213 19:53:14.574163 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:14.616404 containerd[1536]: time="2025-02-13T19:53:14.616317299Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-98c8984b4-vgr28,Uid:84707b19-8011-4451-82e1-8cef7db3c9f4,Namespace:calico-system,Attempt:0,}" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622306 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.622648 kubelet[2855]: W0213 19:53:14.622321 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622337 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622463 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.622648 kubelet[2855]: W0213 19:53:14.622467 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622473 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622553 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.622648 kubelet[2855]: W0213 19:53:14.622558 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622562 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.622648 kubelet[2855]: E0213 19:53:14.622645 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.623807 kubelet[2855]: W0213 19:53:14.622650 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.623807 kubelet[2855]: E0213 19:53:14.622654 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:14.623807 kubelet[2855]: E0213 19:53:14.622758 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.623807 kubelet[2855]: W0213 19:53:14.622763 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.623807 kubelet[2855]: E0213 19:53:14.622768 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.623807 kubelet[2855]: E0213 19:53:14.623603 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:14.623807 kubelet[2855]: W0213 19:53:14.623609 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:14.623807 kubelet[2855]: E0213 19:53:14.623615 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:14.631418 containerd[1536]: time="2025-02-13T19:53:14.631249109Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:14.631418 containerd[1536]: time="2025-02-13T19:53:14.631293612Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:14.631418 containerd[1536]: time="2025-02-13T19:53:14.631302544Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:14.631418 containerd[1536]: time="2025-02-13T19:53:14.631358899Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:14.651118 systemd[1]: Started cri-containerd-07c10b63d8e797db69a5473230008d41f43dea5f02f2de258004870c70c7a845.scope - libcontainer container 07c10b63d8e797db69a5473230008d41f43dea5f02f2de258004870c70c7a845. Feb 13 19:53:14.678872 containerd[1536]: time="2025-02-13T19:53:14.678821185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-98c8984b4-vgr28,Uid:84707b19-8011-4451-82e1-8cef7db3c9f4,Namespace:calico-system,Attempt:0,} returns sandbox id \"07c10b63d8e797db69a5473230008d41f43dea5f02f2de258004870c70c7a845\"" Feb 13 19:53:14.679812 containerd[1536]: time="2025-02-13T19:53:14.679788693Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Feb 13 19:53:14.726087 containerd[1536]: time="2025-02-13T19:53:14.725972125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mhzzt,Uid:dfcc6f11-1602-40b0-8c53-2959bb6a9537,Namespace:calico-system,Attempt:0,}" Feb 13 19:53:14.758633 containerd[1536]: time="2025-02-13T19:53:14.758456249Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:14.758633 containerd[1536]: time="2025-02-13T19:53:14.758491096Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:14.758633 containerd[1536]: time="2025-02-13T19:53:14.758498682Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:14.758633 containerd[1536]: time="2025-02-13T19:53:14.758544263Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:14.777116 systemd[1]: Started cri-containerd-85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54.scope - libcontainer container 85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54. Feb 13 19:53:14.792927 containerd[1536]: time="2025-02-13T19:53:14.792874598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mhzzt,Uid:dfcc6f11-1602-40b0-8c53-2959bb6a9537,Namespace:calico-system,Attempt:0,} returns sandbox id \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\"" Feb 13 19:53:16.527817 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount448058784.mount: Deactivated successfully. Feb 13 19:53:16.574170 kubelet[2855]: E0213 19:53:16.573894 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:16.892642 containerd[1536]: time="2025-02-13T19:53:16.892142616Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:16.906382 containerd[1536]: time="2025-02-13T19:53:16.906358505Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363" Feb 13 19:53:16.915049 containerd[1536]: time="2025-02-13T19:53:16.915023657Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:16.919906 containerd[1536]: time="2025-02-13T19:53:16.919884285Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:16.920526 containerd[1536]: time="2025-02-13T19:53:16.920251290Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.240445242s" Feb 13 19:53:16.920526 containerd[1536]: time="2025-02-13T19:53:16.920272489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Feb 13 19:53:16.921399 containerd[1536]: time="2025-02-13T19:53:16.921368985Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Feb 13 19:53:16.930680 containerd[1536]: time="2025-02-13T19:53:16.930552291Z" level=info msg="CreateContainer within sandbox \"07c10b63d8e797db69a5473230008d41f43dea5f02f2de258004870c70c7a845\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Feb 13 19:53:16.971846 containerd[1536]: time="2025-02-13T19:53:16.971807361Z" level=info msg="CreateContainer within sandbox \"07c10b63d8e797db69a5473230008d41f43dea5f02f2de258004870c70c7a845\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"27693a5d9ae9fdbcbc0634c7d331f6f126e38805020127cc66628fd06b4e6d53\"" Feb 13 19:53:16.972365 containerd[1536]: time="2025-02-13T19:53:16.972348072Z" level=info msg="StartContainer for \"27693a5d9ae9fdbcbc0634c7d331f6f126e38805020127cc66628fd06b4e6d53\"" Feb 13 19:53:17.028080 systemd[1]: Started cri-containerd-27693a5d9ae9fdbcbc0634c7d331f6f126e38805020127cc66628fd06b4e6d53.scope - libcontainer container 27693a5d9ae9fdbcbc0634c7d331f6f126e38805020127cc66628fd06b4e6d53. 
Feb 13 19:53:17.071837 containerd[1536]: time="2025-02-13T19:53:17.071810702Z" level=info msg="StartContainer for \"27693a5d9ae9fdbcbc0634c7d331f6f126e38805020127cc66628fd06b4e6d53\" returns successfully" Feb 13 19:53:17.717624 kubelet[2855]: E0213 19:53:17.717601 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.717624 kubelet[2855]: W0213 19:53:17.717619 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717634 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717736 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.725612 kubelet[2855]: W0213 19:53:17.717741 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717746 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717825 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.725612 kubelet[2855]: W0213 19:53:17.717829 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717833 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717932 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.725612 kubelet[2855]: W0213 19:53:17.717936 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.725612 kubelet[2855]: E0213 19:53:17.717941 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718042 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.735773 kubelet[2855]: W0213 19:53:17.718046 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718051 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718130 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.735773 kubelet[2855]: W0213 19:53:17.718134 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718139 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718229 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.735773 kubelet[2855]: W0213 19:53:17.718233 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718238 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.735773 kubelet[2855]: E0213 19:53:17.718340 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.736089 kubelet[2855]: W0213 19:53:17.718345 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718349 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718454 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.736089 kubelet[2855]: W0213 19:53:17.718459 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718464 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718548 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.736089 kubelet[2855]: W0213 19:53:17.718553 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718557 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.736089 kubelet[2855]: E0213 19:53:17.718642 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.736089 kubelet[2855]: W0213 19:53:17.718646 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718651 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718744 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.741293 kubelet[2855]: W0213 19:53:17.718749 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718754 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718854 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.741293 kubelet[2855]: W0213 19:53:17.718860 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718865 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718951 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.741293 kubelet[2855]: W0213 19:53:17.718956 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.741293 kubelet[2855]: E0213 19:53:17.718960 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.741628 kubelet[2855]: E0213 19:53:17.719065 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.741628 kubelet[2855]: W0213 19:53:17.719070 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.741628 kubelet[2855]: E0213 19:53:17.719075 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.742489 kubelet[2855]: E0213 19:53:17.742417 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.742489 kubelet[2855]: W0213 19:53:17.742430 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.742489 kubelet[2855]: E0213 19:53:17.742444 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.742837 kubelet[2855]: E0213 19:53:17.742813 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.742837 kubelet[2855]: W0213 19:53:17.742834 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.743000 kubelet[2855]: E0213 19:53:17.742845 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.743394 kubelet[2855]: E0213 19:53:17.743255 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.743394 kubelet[2855]: W0213 19:53:17.743269 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.743394 kubelet[2855]: E0213 19:53:17.743285 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.743757 kubelet[2855]: E0213 19:53:17.743639 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.743757 kubelet[2855]: W0213 19:53:17.743647 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.743757 kubelet[2855]: E0213 19:53:17.743661 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.744201 kubelet[2855]: E0213 19:53:17.744019 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.744201 kubelet[2855]: W0213 19:53:17.744027 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.744201 kubelet[2855]: E0213 19:53:17.744038 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.744814 kubelet[2855]: E0213 19:53:17.744396 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.745069 kubelet[2855]: W0213 19:53:17.744994 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.745127 kubelet[2855]: E0213 19:53:17.745118 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.745413 kubelet[2855]: E0213 19:53:17.745405 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.745474 kubelet[2855]: W0213 19:53:17.745454 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.745642 kubelet[2855]: E0213 19:53:17.745533 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.745723 kubelet[2855]: E0213 19:53:17.745703 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.745861 kubelet[2855]: W0213 19:53:17.745763 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.746005 kubelet[2855]: E0213 19:53:17.745942 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.746331 kubelet[2855]: E0213 19:53:17.746150 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.746331 kubelet[2855]: W0213 19:53:17.746163 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.746331 kubelet[2855]: E0213 19:53:17.746190 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.746605 kubelet[2855]: E0213 19:53:17.746540 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.746605 kubelet[2855]: W0213 19:53:17.746548 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.746605 kubelet[2855]: E0213 19:53:17.746559 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.747349 kubelet[2855]: E0213 19:53:17.747336 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.747349 kubelet[2855]: W0213 19:53:17.747346 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.747417 kubelet[2855]: E0213 19:53:17.747357 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.747531 kubelet[2855]: E0213 19:53:17.747518 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.747531 kubelet[2855]: W0213 19:53:17.747526 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.747618 kubelet[2855]: E0213 19:53:17.747598 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.747663 kubelet[2855]: E0213 19:53:17.747658 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.747688 kubelet[2855]: W0213 19:53:17.747664 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.747688 kubelet[2855]: E0213 19:53:17.747676 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.747818 kubelet[2855]: E0213 19:53:17.747805 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.747818 kubelet[2855]: W0213 19:53:17.747813 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.747818 kubelet[2855]: E0213 19:53:17.747821 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.748324 kubelet[2855]: E0213 19:53:17.748030 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.748324 kubelet[2855]: W0213 19:53:17.748039 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.748324 kubelet[2855]: E0213 19:53:17.748051 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.748324 kubelet[2855]: E0213 19:53:17.748197 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.748324 kubelet[2855]: W0213 19:53:17.748204 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.748324 kubelet[2855]: E0213 19:53:17.748213 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:17.748800 kubelet[2855]: E0213 19:53:17.748782 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.748800 kubelet[2855]: W0213 19:53:17.748795 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.748862 kubelet[2855]: E0213 19:53:17.748806 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Feb 13 19:53:17.749167 kubelet[2855]: E0213 19:53:17.749153 2855 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Feb 13 19:53:17.749167 kubelet[2855]: W0213 19:53:17.749162 2855 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Feb 13 19:53:17.749227 kubelet[2855]: E0213 19:53:17.749170 2855 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Feb 13 19:53:18.450145 containerd[1536]: time="2025-02-13T19:53:18.450102404Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:18.452009 containerd[1536]: time="2025-02-13T19:53:18.451968442Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121" Feb 13 19:53:18.452769 containerd[1536]: time="2025-02-13T19:53:18.452743877Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:18.453833 containerd[1536]: time="2025-02-13T19:53:18.453789301Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:18.454469 containerd[1536]: time="2025-02-13T19:53:18.454236851Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.532838811s" Feb 13 19:53:18.454469 containerd[1536]: time="2025-02-13T19:53:18.454257146Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Feb 13 19:53:18.455998 containerd[1536]: time="2025-02-13T19:53:18.455921818Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Feb 13 19:53:18.468694 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount589580264.mount: Deactivated successfully. Feb 13 19:53:18.470181 containerd[1536]: time="2025-02-13T19:53:18.470155334Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8\"" Feb 13 19:53:18.471289 containerd[1536]: time="2025-02-13T19:53:18.470568598Z" level=info msg="StartContainer for \"3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8\"" Feb 13 19:53:18.492076 systemd[1]: Started cri-containerd-3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8.scope - libcontainer container 3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8. Feb 13 19:53:18.517824 containerd[1536]: time="2025-02-13T19:53:18.517797660Z" level=info msg="StartContainer for \"3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8\" returns successfully" Feb 13 19:53:18.520453 systemd[1]: cri-containerd-3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8.scope: Deactivated successfully. 
Feb 13 19:53:18.574736 kubelet[2855]: E0213 19:53:18.574643 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:18.630368 kubelet[2855]: I0213 19:53:18.630345 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:53:18.677858 kubelet[2855]: I0213 19:53:18.665783 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-98c8984b4-vgr28" podStartSLOduration=4.424515057 podStartE2EDuration="6.66576914s" podCreationTimestamp="2025-02-13 19:53:12 +0000 UTC" firstStartedPulling="2025-02-13 19:53:14.679642081 +0000 UTC m=+23.242276475" lastFinishedPulling="2025-02-13 19:53:16.920896159 +0000 UTC m=+25.483530558" observedRunningTime="2025-02-13 19:53:17.640786212 +0000 UTC m=+26.203420627" watchObservedRunningTime="2025-02-13 19:53:18.66576914 +0000 UTC m=+27.228403543" Feb 13 19:53:18.869494 containerd[1536]: time="2025-02-13T19:53:18.859710501Z" level=info msg="shim disconnected" id=3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8 namespace=k8s.io Feb 13 19:53:18.869494 containerd[1536]: time="2025-02-13T19:53:18.869358078Z" level=warning msg="cleaning up after shim disconnected" id=3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8 namespace=k8s.io Feb 13 19:53:18.869494 containerd[1536]: time="2025-02-13T19:53:18.869367025Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:53:18.925588 systemd[1]: run-containerd-runc-k8s.io-3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8-runc.zHN09e.mount: Deactivated successfully. 
Feb 13 19:53:18.925655 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3e3fe65bfc91ce59f05133a1fed6e3de6e35b4bd05422c537ee40dd14a1da3a8-rootfs.mount: Deactivated successfully. Feb 13 19:53:19.634037 containerd[1536]: time="2025-02-13T19:53:19.633791419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Feb 13 19:53:20.574919 kubelet[2855]: E0213 19:53:20.574607 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:22.574355 kubelet[2855]: E0213 19:53:22.574291 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:24.574189 kubelet[2855]: E0213 19:53:24.574138 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:24.612393 containerd[1536]: time="2025-02-13T19:53:24.612350662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:24.617532 containerd[1536]: time="2025-02-13T19:53:24.617488843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154" Feb 13 19:53:24.622362 containerd[1536]: time="2025-02-13T19:53:24.622337033Z" level=info msg="ImageCreate 
event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:24.625562 containerd[1536]: time="2025-02-13T19:53:24.625526466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:24.626130 containerd[1536]: time="2025-02-13T19:53:24.625803611Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 4.991982332s" Feb 13 19:53:24.626130 containerd[1536]: time="2025-02-13T19:53:24.625819374Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Feb 13 19:53:24.659152 containerd[1536]: time="2025-02-13T19:53:24.659120913Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Feb 13 19:53:24.785125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3501897878.mount: Deactivated successfully. 
Feb 13 19:53:24.807500 containerd[1536]: time="2025-02-13T19:53:24.807475995Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b\"" Feb 13 19:53:24.808373 containerd[1536]: time="2025-02-13T19:53:24.808356086Z" level=info msg="StartContainer for \"1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b\"" Feb 13 19:53:24.871172 systemd[1]: Started cri-containerd-1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b.scope - libcontainer container 1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b. Feb 13 19:53:24.896657 containerd[1536]: time="2025-02-13T19:53:24.896624480Z" level=info msg="StartContainer for \"1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b\" returns successfully" Feb 13 19:53:26.357876 systemd[1]: cri-containerd-1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b.scope: Deactivated successfully. Feb 13 19:53:26.388501 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b-rootfs.mount: Deactivated successfully. 
Feb 13 19:53:26.401466 containerd[1536]: time="2025-02-13T19:53:26.401416264Z" level=info msg="shim disconnected" id=1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b namespace=k8s.io Feb 13 19:53:26.401789 containerd[1536]: time="2025-02-13T19:53:26.401525659Z" level=warning msg="cleaning up after shim disconnected" id=1858e646d2eca9e2eb7ce32da32b86f265d2b8d3e07f7bfc4b4a21fdca17aa0b namespace=k8s.io Feb 13 19:53:26.401789 containerd[1536]: time="2025-02-13T19:53:26.401534045Z" level=info msg="cleaning up dead shim" namespace=k8s.io Feb 13 19:53:26.451073 kubelet[2855]: I0213 19:53:26.451050 2855 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Feb 13 19:53:26.526586 kubelet[2855]: I0213 19:53:26.526031 2855 topology_manager.go:215] "Topology Admit Handler" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" podNamespace="kube-system" podName="coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:26.546511 kubelet[2855]: I0213 19:53:26.546448 2855 topology_manager.go:215] "Topology Admit Handler" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" podNamespace="kube-system" podName="coredns-7db6d8ff4d-rknrr" Feb 13 19:53:26.546702 kubelet[2855]: I0213 19:53:26.546687 2855 topology_manager.go:215] "Topology Admit Handler" podUID="9f58da82-3497-4839-8c99-878960089137" podNamespace="calico-system" podName="calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:26.547030 kubelet[2855]: I0213 19:53:26.546946 2855 topology_manager.go:215] "Topology Admit Handler" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" podNamespace="calico-apiserver" podName="calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:26.547030 kubelet[2855]: I0213 19:53:26.547026 2855 topology_manager.go:215] "Topology Admit Handler" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" podNamespace="calico-apiserver" podName="calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:26.604540 systemd[1]: Created slice kubepods-burstable-pode2117422_eaf7_4842_8ba3_82e570607dc9.slice - libcontainer container 
kubepods-burstable-pode2117422_eaf7_4842_8ba3_82e570607dc9.slice. Feb 13 19:53:26.606840 kubelet[2855]: I0213 19:53:26.606698 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbbmv\" (UniqueName: \"kubernetes.io/projected/e2117422-eaf7-4842-8ba3-82e570607dc9-kube-api-access-lbbmv\") pod \"coredns-7db6d8ff4d-t5g9g\" (UID: \"e2117422-eaf7-4842-8ba3-82e570607dc9\") " pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:26.606840 kubelet[2855]: I0213 19:53:26.606720 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9f58da82-3497-4839-8c99-878960089137-tigera-ca-bundle\") pod \"calico-kube-controllers-855555987-xzhkk\" (UID: \"9f58da82-3497-4839-8c99-878960089137\") " pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:26.606840 kubelet[2855]: I0213 19:53:26.606731 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctlpj\" (UniqueName: \"kubernetes.io/projected/b7841c7a-d63f-4929-a2cc-5262cd7ba254-kube-api-access-ctlpj\") pod \"calico-apiserver-66fd84fdb4-2lrbj\" (UID: \"b7841c7a-d63f-4929-a2cc-5262cd7ba254\") " pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:26.606840 kubelet[2855]: I0213 19:53:26.606742 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgdbl\" (UniqueName: \"kubernetes.io/projected/9644224c-d01a-42e1-9f2e-df8377e29c31-kube-api-access-vgdbl\") pod \"coredns-7db6d8ff4d-rknrr\" (UID: \"9644224c-d01a-42e1-9f2e-df8377e29c31\") " pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:26.606840 kubelet[2855]: I0213 19:53:26.606754 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6w4b\" (UniqueName: 
\"kubernetes.io/projected/9f3ee88c-5d3d-4b58-85b2-38498875ee0d-kube-api-access-p6w4b\") pod \"calico-apiserver-66fd84fdb4-cwm4r\" (UID: \"9f3ee88c-5d3d-4b58-85b2-38498875ee0d\") " pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:26.607002 kubelet[2855]: I0213 19:53:26.606772 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9f3ee88c-5d3d-4b58-85b2-38498875ee0d-calico-apiserver-certs\") pod \"calico-apiserver-66fd84fdb4-cwm4r\" (UID: \"9f3ee88c-5d3d-4b58-85b2-38498875ee0d\") " pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:26.607002 kubelet[2855]: I0213 19:53:26.606782 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b7841c7a-d63f-4929-a2cc-5262cd7ba254-calico-apiserver-certs\") pod \"calico-apiserver-66fd84fdb4-2lrbj\" (UID: \"b7841c7a-d63f-4929-a2cc-5262cd7ba254\") " pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:26.607002 kubelet[2855]: I0213 19:53:26.606794 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzzqq\" (UniqueName: \"kubernetes.io/projected/9f58da82-3497-4839-8c99-878960089137-kube-api-access-nzzqq\") pod \"calico-kube-controllers-855555987-xzhkk\" (UID: \"9f58da82-3497-4839-8c99-878960089137\") " pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:26.607002 kubelet[2855]: I0213 19:53:26.606806 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2117422-eaf7-4842-8ba3-82e570607dc9-config-volume\") pod \"coredns-7db6d8ff4d-t5g9g\" (UID: \"e2117422-eaf7-4842-8ba3-82e570607dc9\") " pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:26.607002 kubelet[2855]: I0213 
19:53:26.606815 2855 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9644224c-d01a-42e1-9f2e-df8377e29c31-config-volume\") pod \"coredns-7db6d8ff4d-rknrr\" (UID: \"9644224c-d01a-42e1-9f2e-df8377e29c31\") " pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:26.608336 systemd[1]: Created slice kubepods-burstable-pod9644224c_d01a_42e1_9f2e_df8377e29c31.slice - libcontainer container kubepods-burstable-pod9644224c_d01a_42e1_9f2e_df8377e29c31.slice. Feb 13 19:53:26.614413 systemd[1]: Created slice kubepods-besteffort-pod9f58da82_3497_4839_8c99_878960089137.slice - libcontainer container kubepods-besteffort-pod9f58da82_3497_4839_8c99_878960089137.slice. Feb 13 19:53:26.619899 systemd[1]: Created slice kubepods-besteffort-podb7841c7a_d63f_4929_a2cc_5262cd7ba254.slice - libcontainer container kubepods-besteffort-podb7841c7a_d63f_4929_a2cc_5262cd7ba254.slice. Feb 13 19:53:26.623590 systemd[1]: Created slice kubepods-besteffort-pod9f3ee88c_5d3d_4b58_85b2_38498875ee0d.slice - libcontainer container kubepods-besteffort-pod9f3ee88c_5d3d_4b58_85b2_38498875ee0d.slice. Feb 13 19:53:26.628794 systemd[1]: Created slice kubepods-besteffort-pod34af207b_f303_480e_a684_e96e850daafd.slice - libcontainer container kubepods-besteffort-pod34af207b_f303_480e_a684_e96e850daafd.slice. 
Feb 13 19:53:26.630625 containerd[1536]: time="2025-02-13T19:53:26.630530905Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:0,}" Feb 13 19:53:26.644023 containerd[1536]: time="2025-02-13T19:53:26.643839227Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Feb 13 19:53:26.909051 containerd[1536]: time="2025-02-13T19:53:26.909018782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:0,}" Feb 13 19:53:26.939666 containerd[1536]: time="2025-02-13T19:53:26.939641213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:0,}" Feb 13 19:53:26.940008 containerd[1536]: time="2025-02-13T19:53:26.939995805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:0,}" Feb 13 19:53:26.940358 containerd[1536]: time="2025-02-13T19:53:26.940341130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:0,}" Feb 13 19:53:26.941498 containerd[1536]: time="2025-02-13T19:53:26.941393051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:0,}" Feb 13 19:53:26.964644 containerd[1536]: time="2025-02-13T19:53:26.964600464Z" level=error msg="Failed to destroy network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Feb 13 19:53:26.968184 containerd[1536]: time="2025-02-13T19:53:26.968128353Z" level=error msg="encountered an error cleaning up failed sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.968271 containerd[1536]: time="2025-02-13T19:53:26.968257131Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.976871 containerd[1536]: time="2025-02-13T19:53:26.976809206Z" level=error msg="Failed to destroy network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.977116 containerd[1536]: time="2025-02-13T19:53:26.977098683Z" level=error msg="encountered an error cleaning up failed sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.977193 containerd[1536]: time="2025-02-13T19:53:26.977138032Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.980279 kubelet[2855]: E0213 19:53:26.969051 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.980279 kubelet[2855]: E0213 19:53:26.979239 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:26.980279 kubelet[2855]: E0213 19:53:26.979258 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:26.980279 kubelet[2855]: E0213 19:53:26.978347 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:26.980407 kubelet[2855]: E0213 19:53:26.979293 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:26.980407 kubelet[2855]: E0213 19:53:26.979303 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:26.980407 kubelet[2855]: E0213 19:53:26.979332 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:26.980496 kubelet[2855]: E0213 19:53:26.979427 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:27.047761 containerd[1536]: time="2025-02-13T19:53:27.047730292Z" level=error msg="Failed to destroy network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.047868 containerd[1536]: time="2025-02-13T19:53:27.047730302Z" level=error msg="Failed to destroy network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048472 containerd[1536]: time="2025-02-13T19:53:27.048212016Z" level=error msg="encountered an error cleaning up failed sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048472 containerd[1536]: time="2025-02-13T19:53:27.048247399Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048472 containerd[1536]: time="2025-02-13T19:53:27.048358309Z" level=error msg="encountered an error cleaning up failed sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048472 containerd[1536]: time="2025-02-13T19:53:27.048379805Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048607 kubelet[2855]: E0213 19:53:27.048490 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.048607 kubelet[2855]: E0213 19:53:27.048525 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:27.048607 kubelet[2855]: E0213 19:53:27.048538 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:27.049573 kubelet[2855]: E0213 19:53:27.048567 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:27.049573 kubelet[2855]: E0213 19:53:27.048706 2855 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.049573 kubelet[2855]: E0213 19:53:27.048719 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:27.049946 kubelet[2855]: E0213 19:53:27.048729 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:27.049946 kubelet[2855]: E0213 19:53:27.048742 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:27.056078 containerd[1536]: time="2025-02-13T19:53:27.056013257Z" level=error msg="Failed to destroy network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.056258 containerd[1536]: time="2025-02-13T19:53:27.056239723Z" level=error msg="encountered an error cleaning up failed sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.056306 containerd[1536]: time="2025-02-13T19:53:27.056288679Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.056626 kubelet[2855]: E0213 19:53:27.056413 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.056626 kubelet[2855]: E0213 19:53:27.056448 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:27.056626 kubelet[2855]: E0213 19:53:27.056460 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:27.056716 kubelet[2855]: E0213 19:53:27.056484 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:27.060598 containerd[1536]: 
time="2025-02-13T19:53:27.060543090Z" level=error msg="Failed to destroy network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.060956 containerd[1536]: time="2025-02-13T19:53:27.060865810Z" level=error msg="encountered an error cleaning up failed sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.060956 containerd[1536]: time="2025-02-13T19:53:27.060902186Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.061102 kubelet[2855]: E0213 19:53:27.061070 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.061137 kubelet[2855]: E0213 19:53:27.061119 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:27.061170 kubelet[2855]: E0213 19:53:27.061138 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:27.061192 kubelet[2855]: E0213 19:53:27.061169 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:27.392177 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a-shm.mount: Deactivated successfully. 
Feb 13 19:53:27.654765 kubelet[2855]: I0213 19:53:27.644965 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772" Feb 13 19:53:27.655933 kubelet[2855]: I0213 19:53:27.655917 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2" Feb 13 19:53:27.672864 kubelet[2855]: I0213 19:53:27.672576 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a" Feb 13 19:53:27.673228 kubelet[2855]: I0213 19:53:27.673213 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811" Feb 13 19:53:27.673762 kubelet[2855]: I0213 19:53:27.673749 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5" Feb 13 19:53:27.674276 kubelet[2855]: I0213 19:53:27.674263 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a" Feb 13 19:53:27.738432 containerd[1536]: time="2025-02-13T19:53:27.738369132Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:27.738432 containerd[1536]: time="2025-02-13T19:53:27.738391275Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741440705Z" level=info msg="Ensure that sandbox 113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a in task-service has been cleanup successfully" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741584548Z" level=info msg="TearDown network 
for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741593254Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741646202Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741724689Z" level=info msg="Ensure that sandbox 3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811 in task-service has been cleanup successfully" Feb 13 19:53:27.743436 containerd[1536]: time="2025-02-13T19:53:27.741966820Z" level=info msg="Ensure that sandbox a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2 in task-service has been cleanup successfully" Feb 13 19:53:27.743235 systemd[1]: run-netns-cni\x2d515a0bdd\x2d1743\x2d6d3a\x2de3c0\x2d4f0430dcb1bc.mount: Deactivated successfully. Feb 13 19:53:27.743614 containerd[1536]: time="2025-02-13T19:53:27.743456824Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:27.743614 containerd[1536]: time="2025-02-13T19:53:27.743472127Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:27.743614 containerd[1536]: time="2025-02-13T19:53:27.743515468Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:27.743288 systemd[1]: run-netns-cni\x2d02baebf0\x2d3064\x2d9bbe\x2d30d9\x2d71a8e835f4b5.mount: Deactivated successfully. 
Feb 13 19:53:27.743699 containerd[1536]: time="2025-02-13T19:53:27.743613917Z" level=info msg="Ensure that sandbox ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5 in task-service has been cleanup successfully" Feb 13 19:53:27.743323 systemd[1]: run-netns-cni\x2d67739507\x2d9177\x2ddd55\x2dbcd0\x2da8c784b3a4b9.mount: Deactivated successfully. Feb 13 19:53:27.743992 containerd[1536]: time="2025-02-13T19:53:27.743862754Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:27.743992 containerd[1536]: time="2025-02-13T19:53:27.743900094Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:27.743992 containerd[1536]: time="2025-02-13T19:53:27.743912963Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:27.744165 containerd[1536]: time="2025-02-13T19:53:27.738374803Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:27.744165 containerd[1536]: time="2025-02-13T19:53:27.744138496Z" level=info msg="Ensure that sandbox 2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a in task-service has been cleanup successfully" Feb 13 19:53:27.744301 containerd[1536]: time="2025-02-13T19:53:27.744224193Z" level=info msg="Ensure that sandbox d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772 in task-service has been cleanup successfully" Feb 13 19:53:27.744301 containerd[1536]: time="2025-02-13T19:53:27.744226562Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:27.744301 containerd[1536]: time="2025-02-13T19:53:27.744249833Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns 
successfully" Feb 13 19:53:27.745065 containerd[1536]: time="2025-02-13T19:53:27.744510357Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:27.745065 containerd[1536]: time="2025-02-13T19:53:27.744520506Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:27.745995 containerd[1536]: time="2025-02-13T19:53:27.745135135Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:27.745995 containerd[1536]: time="2025-02-13T19:53:27.745145328Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:27.747970 systemd[1]: run-netns-cni\x2dd6dd045b\x2d7536\x2debda\x2dcbc3\x2dd233ecbca51d.mount: Deactivated successfully. Feb 13 19:53:27.748074 systemd[1]: run-netns-cni\x2d65d98e0e\x2d76fb\x2d2ad6\x2da417\x2d7c1e4fd613d0.mount: Deactivated successfully. Feb 13 19:53:27.748116 systemd[1]: run-netns-cni\x2df57d205f\x2d99d6\x2d2484\x2d2507\x2d4146fe1f727f.mount: Deactivated successfully. 
Feb 13 19:53:27.750500 containerd[1536]: time="2025-02-13T19:53:27.750479177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:1,}" Feb 13 19:53:27.750881 containerd[1536]: time="2025-02-13T19:53:27.750550915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:1,}" Feb 13 19:53:27.750881 containerd[1536]: time="2025-02-13T19:53:27.750660279Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:1,}" Feb 13 19:53:27.750881 containerd[1536]: time="2025-02-13T19:53:27.750770106Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:1,}" Feb 13 19:53:27.751419 containerd[1536]: time="2025-02-13T19:53:27.751285314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:1,}" Feb 13 19:53:27.751626 containerd[1536]: time="2025-02-13T19:53:27.751614448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:1,}" Feb 13 19:53:27.846165 containerd[1536]: time="2025-02-13T19:53:27.846074774Z" level=error msg="Failed to destroy network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.846924 containerd[1536]: time="2025-02-13T19:53:27.846411044Z" level=error msg="encountered an error cleaning up 
failed sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.846924 containerd[1536]: time="2025-02-13T19:53:27.846895649Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.847960 kubelet[2855]: E0213 19:53:27.847716 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.847960 kubelet[2855]: E0213 19:53:27.847754 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:27.847960 kubelet[2855]: E0213 19:53:27.847766 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:27.848215 kubelet[2855]: E0213 19:53:27.847800 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:27.869436 containerd[1536]: time="2025-02-13T19:53:27.869269736Z" level=error msg="Failed to destroy network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.869677 containerd[1536]: time="2025-02-13T19:53:27.869589413Z" level=error msg="encountered an error cleaning up failed sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.869677 containerd[1536]: 
time="2025-02-13T19:53:27.869651816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.870757 kubelet[2855]: E0213 19:53:27.870146 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.870757 kubelet[2855]: E0213 19:53:27.870195 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:27.870757 kubelet[2855]: E0213 19:53:27.870212 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 
19:53:27.870833 kubelet[2855]: E0213 19:53:27.870242 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:27.878006 containerd[1536]: time="2025-02-13T19:53:27.877902902Z" level=error msg="Failed to destroy network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.878271 containerd[1536]: time="2025-02-13T19:53:27.878160397Z" level=error msg="encountered an error cleaning up failed sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.878271 containerd[1536]: time="2025-02-13T19:53:27.878197773Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox 
\"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.878359 kubelet[2855]: E0213 19:53:27.878338 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.879116 kubelet[2855]: E0213 19:53:27.878389 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:27.879116 kubelet[2855]: E0213 19:53:27.878409 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:27.879116 kubelet[2855]: E0213 19:53:27.878437 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:27.879345 containerd[1536]: time="2025-02-13T19:53:27.879330293Z" level=error msg="Failed to destroy network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.879628 containerd[1536]: time="2025-02-13T19:53:27.879614696Z" level=error msg="encountered an error cleaning up failed sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.883121 containerd[1536]: time="2025-02-13T19:53:27.883102238Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.883972 kubelet[2855]: E0213 19:53:27.883804 2855 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.883972 kubelet[2855]: E0213 19:53:27.883833 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:27.883972 kubelet[2855]: E0213 19:53:27.883845 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:27.884065 kubelet[2855]: E0213 19:53:27.883869 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:27.890321 containerd[1536]: time="2025-02-13T19:53:27.890291873Z" level=error msg="Failed to destroy network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.890570 containerd[1536]: time="2025-02-13T19:53:27.890557087Z" level=error msg="encountered an error cleaning up failed sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.890644 containerd[1536]: time="2025-02-13T19:53:27.890632857Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.891016 kubelet[2855]: E0213 19:53:27.890789 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" Feb 13 19:53:27.891016 kubelet[2855]: E0213 19:53:27.890828 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:27.891016 kubelet[2855]: E0213 19:53:27.890839 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:27.891737 kubelet[2855]: E0213 19:53:27.890868 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:27.891873 containerd[1536]: time="2025-02-13T19:53:27.891860122Z" level=error msg="Failed to destroy network for sandbox 
\"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.892077 containerd[1536]: time="2025-02-13T19:53:27.892063971Z" level=error msg="encountered an error cleaning up failed sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.892158 containerd[1536]: time="2025-02-13T19:53:27.892145371Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.892307 kubelet[2855]: E0213 19:53:27.892281 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:27.892336 kubelet[2855]: E0213 19:53:27.892313 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:27.892336 kubelet[2855]: E0213 19:53:27.892326 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:27.892395 kubelet[2855]: E0213 19:53:27.892350 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:28.391046 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a-shm.mount: Deactivated successfully. 
Feb 13 19:53:28.676552 kubelet[2855]: I0213 19:53:28.676475 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f" Feb 13 19:53:28.677223 containerd[1536]: time="2025-02-13T19:53:28.677103446Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:28.677553 containerd[1536]: time="2025-02-13T19:53:28.677530617Z" level=info msg="Ensure that sandbox c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f in task-service has been cleanup successfully" Feb 13 19:53:28.681278 kubelet[2855]: I0213 19:53:28.677787 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a" Feb 13 19:53:28.679247 systemd[1]: run-netns-cni\x2d41f7ea12\x2d9d90\x2d439f\x2d5637\x2df40600221766.mount: Deactivated successfully. Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.677867398Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.677879238Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678197116Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678320380Z" level=info msg="Ensure that sandbox cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a in task-service has been cleanup successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678459428Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" 
successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678470118Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678708620Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678836761Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678847378Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678914268Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678979655Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.678995325Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:28.681470 containerd[1536]: time="2025-02-13T19:53:28.679429664Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:2,}" Feb 13 19:53:28.682193 containerd[1536]: time="2025-02-13T19:53:28.681965198Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:2,}" Feb 13 19:53:28.684320 kubelet[2855]: I0213 19:53:28.683753 2855 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0" Feb 13 19:53:28.684458 systemd[1]: run-netns-cni\x2d131b40c3\x2d63b3\x2d0f82\x2d9240\x2d475836f5c719.mount: Deactivated successfully. Feb 13 19:53:28.684723 containerd[1536]: time="2025-02-13T19:53:28.684474853Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:28.684723 containerd[1536]: time="2025-02-13T19:53:28.684591050Z" level=info msg="Ensure that sandbox b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0 in task-service has been cleanup successfully" Feb 13 19:53:28.684945 containerd[1536]: time="2025-02-13T19:53:28.684930874Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:28.685024 containerd[1536]: time="2025-02-13T19:53:28.685013823Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:28.686961 containerd[1536]: time="2025-02-13T19:53:28.686837738Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:28.686961 containerd[1536]: time="2025-02-13T19:53:28.686891824Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:28.686961 containerd[1536]: time="2025-02-13T19:53:28.686901195Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:28.687373 containerd[1536]: time="2025-02-13T19:53:28.687359651Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:2,}" Feb 13 19:53:28.687427 systemd[1]: 
run-netns-cni\x2d71eb44d2\x2da994\x2d3ef0\x2d80e7\x2dc0508339f64a.mount: Deactivated successfully. Feb 13 19:53:28.687853 kubelet[2855]: I0213 19:53:28.687484 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f" Feb 13 19:53:28.688254 containerd[1536]: time="2025-02-13T19:53:28.688227918Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:28.688392 containerd[1536]: time="2025-02-13T19:53:28.688350342Z" level=info msg="Ensure that sandbox 9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f in task-service has been cleanup successfully" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.690095674Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.690110592Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.690390420Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.690498336Z" level=info msg="Ensure that sandbox 787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621 in task-service has been cleanup successfully" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.690979255Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:28.691012 containerd[1536]: time="2025-02-13T19:53:28.691005681Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:28.690820 systemd[1]: 
run-netns-cni\x2d975496c3\x2d14b5\x2dfe3b\x2db2b8\x2d2e567d7e12b6.mount: Deactivated successfully. Feb 13 19:53:28.691249 kubelet[2855]: I0213 19:53:28.690140 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621" Feb 13 19:53:28.691278 containerd[1536]: time="2025-02-13T19:53:28.691122447Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:28.691278 containerd[1536]: time="2025-02-13T19:53:28.691169308Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:28.691278 containerd[1536]: time="2025-02-13T19:53:28.691176900Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:28.693028 containerd[1536]: time="2025-02-13T19:53:28.692585557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:2,}" Feb 13 19:53:28.693028 containerd[1536]: time="2025-02-13T19:53:28.692705205Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:28.693028 containerd[1536]: time="2025-02-13T19:53:28.692748317Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:28.693028 containerd[1536]: time="2025-02-13T19:53:28.692755823Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:28.693028 containerd[1536]: time="2025-02-13T19:53:28.693005672Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:2,}" 
Feb 13 19:53:28.693437 kubelet[2855]: I0213 19:53:28.693418 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450" Feb 13 19:53:28.693644 containerd[1536]: time="2025-02-13T19:53:28.693625933Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:28.693791 containerd[1536]: time="2025-02-13T19:53:28.693770794Z" level=info msg="Ensure that sandbox 3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450 in task-service has been cleanup successfully" Feb 13 19:53:28.693962 containerd[1536]: time="2025-02-13T19:53:28.693945087Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:28.693962 containerd[1536]: time="2025-02-13T19:53:28.693958709Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:28.694140 containerd[1536]: time="2025-02-13T19:53:28.694122786Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:28.694182 containerd[1536]: time="2025-02-13T19:53:28.694168876Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:28.694182 containerd[1536]: time="2025-02-13T19:53:28.694180782Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:28.694539 containerd[1536]: time="2025-02-13T19:53:28.694476902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:2,}" Feb 13 19:53:29.163374 containerd[1536]: time="2025-02-13T19:53:29.163337375Z" level=error msg="Failed to 
destroy network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.163998 containerd[1536]: time="2025-02-13T19:53:29.163762040Z" level=error msg="encountered an error cleaning up failed sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.163998 containerd[1536]: time="2025-02-13T19:53:29.163968386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.164531 kubelet[2855]: E0213 19:53:29.164302 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.164531 kubelet[2855]: E0213 19:53:29.164344 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\": 
plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:29.164531 kubelet[2855]: E0213 19:53:29.164365 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:29.164638 kubelet[2855]: E0213 19:53:29.164394 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:29.194850 containerd[1536]: time="2025-02-13T19:53:29.194816053Z" level=error msg="Failed to destroy network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.195101 containerd[1536]: time="2025-02-13T19:53:29.195083020Z" level=error 
msg="encountered an error cleaning up failed sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.195141 containerd[1536]: time="2025-02-13T19:53:29.195121744Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.195485 kubelet[2855]: E0213 19:53:29.195272 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.195485 kubelet[2855]: E0213 19:53:29.195306 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:29.195485 kubelet[2855]: E0213 19:53:29.195319 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:29.195562 kubelet[2855]: E0213 19:53:29.195350 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:29.215314 containerd[1536]: time="2025-02-13T19:53:29.215284667Z" level=error msg="Failed to destroy network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.216154 containerd[1536]: time="2025-02-13T19:53:29.215699701Z" level=error msg="encountered an error cleaning up failed sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.216154 containerd[1536]: time="2025-02-13T19:53:29.215740144Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.216483 kubelet[2855]: E0213 19:53:29.216206 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.216483 kubelet[2855]: E0213 19:53:29.216241 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:29.216483 kubelet[2855]: E0213 19:53:29.216253 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:29.216569 kubelet[2855]: E0213 19:53:29.216287 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:29.217043 containerd[1536]: time="2025-02-13T19:53:29.217029155Z" level=error msg="Failed to destroy network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.218242 containerd[1536]: time="2025-02-13T19:53:29.218165076Z" level=error msg="encountered an error cleaning up failed sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.218497 containerd[1536]: time="2025-02-13T19:53:29.218451799Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox 
\"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.218753 kubelet[2855]: E0213 19:53:29.218649 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.218753 kubelet[2855]: E0213 19:53:29.218682 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:29.218753 kubelet[2855]: E0213 19:53:29.218695 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:29.218835 kubelet[2855]: E0213 19:53:29.218719 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:29.232246 containerd[1536]: time="2025-02-13T19:53:29.232079665Z" level=error msg="Failed to destroy network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.232334 containerd[1536]: time="2025-02-13T19:53:29.232285733Z" level=error msg="encountered an error cleaning up failed sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.232334 containerd[1536]: time="2025-02-13T19:53:29.232321636Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.232689 kubelet[2855]: E0213 19:53:29.232442 2855 remote_runtime.go:193] "RunPodSandbox from 
runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.232689 kubelet[2855]: E0213 19:53:29.232475 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:29.232689 kubelet[2855]: E0213 19:53:29.232487 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:29.232762 kubelet[2855]: E0213 19:53:29.232515 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:29.241144 containerd[1536]: time="2025-02-13T19:53:29.241027715Z" level=error msg="Failed to destroy network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.241503 containerd[1536]: time="2025-02-13T19:53:29.241338743Z" level=error msg="encountered an error cleaning up failed sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.241503 containerd[1536]: time="2025-02-13T19:53:29.241420611Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.241852 kubelet[2855]: E0213 19:53:29.241646 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.241852 kubelet[2855]: E0213 19:53:29.241687 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:29.241852 kubelet[2855]: E0213 19:53:29.241701 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:29.241930 kubelet[2855]: E0213 19:53:29.241729 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:29.392813 systemd[1]: 
run-netns-cni\x2d45b8af6d\x2dfb97\x2dce1b\x2d78c3\x2ddb45597de6ab.mount: Deactivated successfully. Feb 13 19:53:29.392870 systemd[1]: run-netns-cni\x2daf91d010\x2d637e\x2df393\x2d32e4\x2d2c9488f91275.mount: Deactivated successfully. Feb 13 19:53:29.696141 kubelet[2855]: I0213 19:53:29.696125 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166" Feb 13 19:53:29.696845 containerd[1536]: time="2025-02-13T19:53:29.696698172Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:29.697770 containerd[1536]: time="2025-02-13T19:53:29.697754663Z" level=info msg="Ensure that sandbox 3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166 in task-service has been cleanup successfully" Feb 13 19:53:29.698034 containerd[1536]: time="2025-02-13T19:53:29.697866404Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:29.698034 containerd[1536]: time="2025-02-13T19:53:29.697876818Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.698588661Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.698629106Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.698635575Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.698839306Z" level=info msg="StopPodSandbox for 
\"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699135159Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699142702Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699196240Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699276948Z" level=info msg="Ensure that sandbox d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6 in task-service has been cleanup successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699559054Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:3,}" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699782178Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.699790348Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700194524Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700231361Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:29.702305 containerd[1536]: 
time="2025-02-13T19:53:29.700237286Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700420646Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700468117Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700475588Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.700695109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:3,}" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701306500Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701388024Z" level=info msg="Ensure that sandbox 066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95 in task-service has been cleanup successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701481907Z" level=info msg="TearDown network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701489307Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701688025Z" level=info msg="StopPodSandbox for 
\"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701736708Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701744250Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701922720Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701954842Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:29.702305 containerd[1536]: time="2025-02-13T19:53:29.701975381Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:29.699664 systemd[1]: run-netns-cni\x2d26dcb132\x2d91d7\x2d9762\x2d683d\x2d6d2577038a07.mount: Deactivated successfully. 
Feb 13 19:53:29.703210 kubelet[2855]: I0213 19:53:29.698925 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6" Feb 13 19:53:29.703210 kubelet[2855]: I0213 19:53:29.701110 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95" Feb 13 19:53:29.703210 kubelet[2855]: I0213 19:53:29.702746 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c" Feb 13 19:53:29.703272 containerd[1536]: time="2025-02-13T19:53:29.702323564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:3,}" Feb 13 19:53:29.703926 containerd[1536]: time="2025-02-13T19:53:29.703293742Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:29.703926 containerd[1536]: time="2025-02-13T19:53:29.703433502Z" level=info msg="Ensure that sandbox 1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c in task-service has been cleanup successfully" Feb 13 19:53:29.703926 containerd[1536]: time="2025-02-13T19:53:29.703527197Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:29.703926 containerd[1536]: time="2025-02-13T19:53:29.703535600Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:29.703926 containerd[1536]: time="2025-02-13T19:53:29.703892601Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:29.705115 containerd[1536]: time="2025-02-13T19:53:29.704199039Z" level=info 
msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:29.705115 containerd[1536]: time="2025-02-13T19:53:29.704207118Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:29.705115 containerd[1536]: time="2025-02-13T19:53:29.704369536Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:29.705115 containerd[1536]: time="2025-02-13T19:53:29.704502288Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:29.705115 containerd[1536]: time="2025-02-13T19:53:29.704509749Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:29.705227 containerd[1536]: time="2025-02-13T19:53:29.705182946Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:3,}" Feb 13 19:53:29.705613 kubelet[2855]: I0213 19:53:29.705526 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c" Feb 13 19:53:29.705888 containerd[1536]: time="2025-02-13T19:53:29.705873447Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.705965822Z" level=info msg="Ensure that sandbox 3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c in task-service has been cleanup successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.706467675Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 
19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.706475984Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.706756497Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.706915578Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.706923466Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707306696Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707406248Z" level=info msg="Ensure that sandbox 36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec in task-service has been cleanup successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707547533Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707672883Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707684057Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707862091Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:3,}" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707913319Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.707920632Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708116332Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708241724Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708248955Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708473758Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708612028Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:29.708994 containerd[1536]: time="2025-02-13T19:53:29.708639389Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:29.709354 kubelet[2855]: I0213 19:53:29.707126 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec" Feb 13 19:53:29.706702 systemd[1]: 
run-netns-cni\x2d9f85f7ff\x2d8fb2\x2df529\x2d38e9\x2d61c8cbe65a85.mount: Deactivated successfully. Feb 13 19:53:29.709460 containerd[1536]: time="2025-02-13T19:53:29.708955354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:3,}" Feb 13 19:53:29.706755 systemd[1]: run-netns-cni\x2d5fdac3c7\x2d3799\x2d3e47\x2d92af\x2d020515e97027.mount: Deactivated successfully. Feb 13 19:53:29.706791 systemd[1]: run-netns-cni\x2dbfd58b26\x2d8297\x2dd1b7\x2d9801\x2d2ca55618d251.mount: Deactivated successfully. Feb 13 19:53:29.712547 systemd[1]: run-netns-cni\x2d206d79fa\x2dd50f\x2d3e95\x2d73f7\x2d5ce7f9e6e337.mount: Deactivated successfully. Feb 13 19:53:29.712653 systemd[1]: run-netns-cni\x2d9689e651\x2dcd2d\x2daab8\x2da28b\x2d5b9eecaefff5.mount: Deactivated successfully. Feb 13 19:53:29.887920 containerd[1536]: time="2025-02-13T19:53:29.887836972Z" level=error msg="Failed to destroy network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888389 containerd[1536]: time="2025-02-13T19:53:29.888211825Z" level=error msg="encountered an error cleaning up failed sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888389 containerd[1536]: time="2025-02-13T19:53:29.888255694Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for 
sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888389 containerd[1536]: time="2025-02-13T19:53:29.888337423Z" level=error msg="Failed to destroy network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888599 containerd[1536]: time="2025-02-13T19:53:29.888583208Z" level=error msg="encountered an error cleaning up failed sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888662 containerd[1536]: time="2025-02-13T19:53:29.888650901Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.888775 containerd[1536]: time="2025-02-13T19:53:29.888763511Z" level=error msg="Failed to destroy network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Feb 13 19:53:29.888967 containerd[1536]: time="2025-02-13T19:53:29.888954393Z" level=error msg="encountered an error cleaning up failed sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.889052 containerd[1536]: time="2025-02-13T19:53:29.889039149Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.889159 containerd[1536]: time="2025-02-13T19:53:29.889146060Z" level=error msg="Failed to destroy network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.889484 containerd[1536]: time="2025-02-13T19:53:29.889329734Z" level=error msg="encountered an error cleaning up failed sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.889484 containerd[1536]: time="2025-02-13T19:53:29.889351906Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.896586 containerd[1536]: time="2025-02-13T19:53:29.896492477Z" level=error msg="Failed to destroy network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.896779 kubelet[2855]: E0213 19:53:29.896750 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.896831 kubelet[2855]: E0213 19:53:29.896790 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:29.896831 kubelet[2855]: E0213 19:53:29.896805 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:29.896877 kubelet[2855]: E0213 19:53:29.896829 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:29.897316 kubelet[2855]: E0213 19:53:29.897027 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.897316 kubelet[2855]: E0213 19:53:29.897048 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:29.897316 kubelet[2855]: E0213 19:53:29.896754 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.897316 kubelet[2855]: E0213 19:53:29.897083 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:29.897419 kubelet[2855]: E0213 19:53:29.897110 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:29.897419 kubelet[2855]: E0213 19:53:29.897129 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:29.897419 kubelet[2855]: E0213 19:53:29.897243 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:29.897544 kubelet[2855]: E0213 19:53:29.897266 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:29.897544 kubelet[2855]: E0213 19:53:29.897286 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.897544 kubelet[2855]: E0213 19:53:29.897297 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:29.897633 containerd[1536]: time="2025-02-13T19:53:29.897461134Z" level=error msg="encountered an error cleaning up failed sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.897633 containerd[1536]: time="2025-02-13T19:53:29.897496268Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.898599 kubelet[2855]: E0213 19:53:29.897306 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:29.898599 kubelet[2855]: E0213 19:53:29.898277 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:29.898965 kubelet[2855]: E0213 19:53:29.898882 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.898965 kubelet[2855]: E0213 19:53:29.898904 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:29.898965 
kubelet[2855]: E0213 19:53:29.898920 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:29.899469 kubelet[2855]: E0213 19:53:29.898937 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:29.918856 containerd[1536]: time="2025-02-13T19:53:29.918823252Z" level=error msg="Failed to destroy network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.919119 containerd[1536]: time="2025-02-13T19:53:29.919099968Z" level=error msg="encountered an error cleaning up failed sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin 
type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.919576 containerd[1536]: time="2025-02-13T19:53:29.919330035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.919611 kubelet[2855]: E0213 19:53:29.919466 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:29.919611 kubelet[2855]: E0213 19:53:29.919513 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:29.919611 kubelet[2855]: E0213 19:53:29.919527 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:29.919681 kubelet[2855]: E0213 19:53:29.919554 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:30.394462 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c-shm.mount: Deactivated successfully. Feb 13 19:53:30.710960 kubelet[2855]: I0213 19:53:30.710530 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03" Feb 13 19:53:30.711484 containerd[1536]: time="2025-02-13T19:53:30.711044394Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:30.732867 containerd[1536]: time="2025-02-13T19:53:30.731016007Z" level=info msg="Ensure that sandbox ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03 in task-service has been cleanup successfully" Feb 13 19:53:30.732451 systemd[1]: run-netns-cni\x2d89258b81\x2d9d43\x2d7aaa\x2db72b\x2df377e595109c.mount: Deactivated successfully. 
Feb 13 19:53:30.733239 containerd[1536]: time="2025-02-13T19:53:30.733162813Z" level=info msg="TearDown network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" successfully" Feb 13 19:53:30.733239 containerd[1536]: time="2025-02-13T19:53:30.733174332Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" returns successfully" Feb 13 19:53:30.734892 containerd[1536]: time="2025-02-13T19:53:30.733961396Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:30.734892 containerd[1536]: time="2025-02-13T19:53:30.734017771Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:30.734892 containerd[1536]: time="2025-02-13T19:53:30.734026704Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:30.734892 containerd[1536]: time="2025-02-13T19:53:30.734268975Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:30.746680 containerd[1536]: time="2025-02-13T19:53:30.734318452Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:30.746680 containerd[1536]: time="2025-02-13T19:53:30.746615804Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:30.756253 containerd[1536]: time="2025-02-13T19:53:30.746861318Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:30.756253 containerd[1536]: time="2025-02-13T19:53:30.746902421Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 
13 19:53:30.756253 containerd[1536]: time="2025-02-13T19:53:30.746908253Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:30.756253 containerd[1536]: time="2025-02-13T19:53:30.747364430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:4,}" Feb 13 19:53:30.804867 kubelet[2855]: I0213 19:53:30.804743 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf" Feb 13 19:53:30.805488 containerd[1536]: time="2025-02-13T19:53:30.805468557Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:30.805678 containerd[1536]: time="2025-02-13T19:53:30.805584614Z" level=info msg="Ensure that sandbox 3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf in task-service has been cleanup successfully" Feb 13 19:53:30.808312 systemd[1]: run-netns-cni\x2d9537d33e\x2d763c\x2d749e\x2d8394\x2df31a50f611bb.mount: Deactivated successfully. 
Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.813333795Z" level=info msg="TearDown network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.813348729Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" returns successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.814309437Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.814350111Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.814356178Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.815024786Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.815058596Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.815064269Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.817336884Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.817375224Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 
13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.817380957Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:30.820103 containerd[1536]: time="2025-02-13T19:53:30.817753326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:4,}" Feb 13 19:53:30.835948 kubelet[2855]: I0213 19:53:30.835936 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c" Feb 13 19:53:30.837467 containerd[1536]: time="2025-02-13T19:53:30.837266031Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:30.841438 containerd[1536]: time="2025-02-13T19:53:30.841375804Z" level=info msg="Ensure that sandbox 546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c in task-service has been cleanup successfully" Feb 13 19:53:30.842775 systemd[1]: run-netns-cni\x2df218ad29\x2da160\x2d1f0f\x2dde18\x2d02dca3d8bd17.mount: Deactivated successfully. 
Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.842811019Z" level=info msg="TearDown network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.842822455Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" returns successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.843495571Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.843534839Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.843540980Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.844672708Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.844779939Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.844788448Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.844999477Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.845048918Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 
13 19:53:30.849482 containerd[1536]: time="2025-02-13T19:53:30.845057221Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:30.891948 kubelet[2855]: I0213 19:53:30.891865 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8" Feb 13 19:53:30.942054 containerd[1536]: time="2025-02-13T19:53:30.941963647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:4,}" Feb 13 19:53:30.942291 containerd[1536]: time="2025-02-13T19:53:30.942274590Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:30.942372 containerd[1536]: time="2025-02-13T19:53:30.942358746Z" level=info msg="Ensure that sandbox b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8 in task-service has been cleanup successfully" Feb 13 19:53:30.942972 containerd[1536]: time="2025-02-13T19:53:30.942514424Z" level=info msg="TearDown network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" successfully" Feb 13 19:53:30.942972 containerd[1536]: time="2025-02-13T19:53:30.942535269Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" returns successfully" Feb 13 19:53:30.988886 containerd[1536]: time="2025-02-13T19:53:30.988826092Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:30.995277 containerd[1536]: time="2025-02-13T19:53:30.989717061Z" level=info msg="TearDown network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:30.995277 containerd[1536]: time="2025-02-13T19:53:30.989728073Z" level=info msg="StopPodSandbox for 
\"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:30.995277 containerd[1536]: time="2025-02-13T19:53:30.994469053Z" level=error msg="Failed to destroy network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:30.995277 containerd[1536]: time="2025-02-13T19:53:30.994930578Z" level=error msg="encountered an error cleaning up failed sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:30.995277 containerd[1536]: time="2025-02-13T19:53:30.995087925Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.023104 kubelet[2855]: E0213 19:53:31.023081 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.023233 kubelet[2855]: E0213 19:53:31.023218 2855 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:31.023293 kubelet[2855]: E0213 19:53:31.023282 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:31.023425 kubelet[2855]: E0213 19:53:31.023406 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:31.023857 containerd[1536]: time="2025-02-13T19:53:31.023837648Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:31.023911 containerd[1536]: time="2025-02-13T19:53:31.023896717Z" level=info 
msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:31.023911 containerd[1536]: time="2025-02-13T19:53:31.023906645Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:31.051542 containerd[1536]: time="2025-02-13T19:53:31.050753009Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:31.052146 containerd[1536]: time="2025-02-13T19:53:31.051793797Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:31.052199 containerd[1536]: time="2025-02-13T19:53:31.051815592Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:31.066658 containerd[1536]: time="2025-02-13T19:53:31.066633748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:4,}" Feb 13 19:53:31.085502 kubelet[2855]: I0213 19:53:31.085203 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375" Feb 13 19:53:31.086463 containerd[1536]: time="2025-02-13T19:53:31.086445419Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:31.109663 containerd[1536]: time="2025-02-13T19:53:31.109639810Z" level=info msg="Ensure that sandbox de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375 in task-service has been cleanup successfully" Feb 13 19:53:31.109877 containerd[1536]: time="2025-02-13T19:53:31.109866558Z" level=info msg="TearDown network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" 
successfully" Feb 13 19:53:31.109930 containerd[1536]: time="2025-02-13T19:53:31.109922079Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" returns successfully" Feb 13 19:53:31.110457 containerd[1536]: time="2025-02-13T19:53:31.110443689Z" level=error msg="Failed to destroy network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.110697 containerd[1536]: time="2025-02-13T19:53:31.110674919Z" level=error msg="encountered an error cleaning up failed sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.110774 containerd[1536]: time="2025-02-13T19:53:31.110751419Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.115192 kubelet[2855]: E0213 19:53:31.115171 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.116022 kubelet[2855]: E0213 19:53:31.115856 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:31.116022 kubelet[2855]: E0213 19:53:31.115874 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:31.116022 kubelet[2855]: E0213 19:53:31.115910 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:31.117866 containerd[1536]: time="2025-02-13T19:53:31.117665655Z" 
level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:31.117866 containerd[1536]: time="2025-02-13T19:53:31.117725396Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:31.117866 containerd[1536]: time="2025-02-13T19:53:31.117733792Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:31.130439 containerd[1536]: time="2025-02-13T19:53:31.130416155Z" level=error msg="Failed to destroy network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.130692 containerd[1536]: time="2025-02-13T19:53:31.130677415Z" level=error msg="encountered an error cleaning up failed sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.130769 containerd[1536]: time="2025-02-13T19:53:31.130756748Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.143443 kubelet[2855]: E0213 19:53:31.143373 2855 remote_runtime.go:193] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.143443 kubelet[2855]: E0213 19:53:31.143408 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:31.143443 kubelet[2855]: E0213 19:53:31.143421 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:31.143547 kubelet[2855]: E0213 19:53:31.143451 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:31.144458 containerd[1536]: time="2025-02-13T19:53:31.144376907Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:31.144458 containerd[1536]: time="2025-02-13T19:53:31.144437871Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:31.144458 containerd[1536]: time="2025-02-13T19:53:31.144448145Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:31.144724 containerd[1536]: time="2025-02-13T19:53:31.144713666Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:31.145419 containerd[1536]: time="2025-02-13T19:53:31.145408201Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:31.145464 containerd[1536]: time="2025-02-13T19:53:31.145456812Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:31.145705 containerd[1536]: time="2025-02-13T19:53:31.145694835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:4,}" Feb 13 19:53:31.174151 containerd[1536]: time="2025-02-13T19:53:31.174126488Z" level=error msg="Failed to destroy network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.174435 containerd[1536]: time="2025-02-13T19:53:31.174396013Z" level=error msg="encountered an error cleaning up failed sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.185363 containerd[1536]: time="2025-02-13T19:53:31.174495380Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.185430 kubelet[2855]: E0213 19:53:31.174612 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.185430 kubelet[2855]: E0213 19:53:31.174646 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:31.185430 kubelet[2855]: E0213 19:53:31.174658 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:31.185501 kubelet[2855]: E0213 19:53:31.174682 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:31.253251 containerd[1536]: time="2025-02-13T19:53:31.253176761Z" level=error msg="Failed to destroy network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.254377 containerd[1536]: time="2025-02-13T19:53:31.254097585Z" level=error msg="encountered an error cleaning up failed sandbox 
\"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.254377 containerd[1536]: time="2025-02-13T19:53:31.254155815Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.254670 kubelet[2855]: E0213 19:53:31.254506 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.255334 kubelet[2855]: E0213 19:53:31.254731 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:31.255334 kubelet[2855]: E0213 19:53:31.254749 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:31.255334 kubelet[2855]: E0213 19:53:31.254774 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:31.299577 kubelet[2855]: I0213 19:53:31.299500 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573" Feb 13 19:53:31.326292 containerd[1536]: time="2025-02-13T19:53:31.326092789Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:31.326292 containerd[1536]: time="2025-02-13T19:53:31.326214232Z" level=info msg="Ensure that sandbox 02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573 in task-service has been cleanup successfully" Feb 13 19:53:31.326437 containerd[1536]: time="2025-02-13T19:53:31.326426770Z" level=info msg="TearDown network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" successfully" Feb 13 19:53:31.326589 containerd[1536]: time="2025-02-13T19:53:31.326476307Z" 
level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" returns successfully" Feb 13 19:53:31.328672 containerd[1536]: time="2025-02-13T19:53:31.328655740Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:31.328722 containerd[1536]: time="2025-02-13T19:53:31.328710279Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 19:53:31.328746 containerd[1536]: time="2025-02-13T19:53:31.328720267Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:31.329145 containerd[1536]: time="2025-02-13T19:53:31.329133890Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:31.329232 containerd[1536]: time="2025-02-13T19:53:31.329219555Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:31.329270 containerd[1536]: time="2025-02-13T19:53:31.329263197Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:31.330008 containerd[1536]: time="2025-02-13T19:53:31.329498810Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:31.330008 containerd[1536]: time="2025-02-13T19:53:31.329541583Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:31.330008 containerd[1536]: time="2025-02-13T19:53:31.329552191Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:31.331106 containerd[1536]: time="2025-02-13T19:53:31.331091177Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:4,}" Feb 13 19:53:31.392266 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895-shm.mount: Deactivated successfully. Feb 13 19:53:31.392444 systemd[1]: run-netns-cni\x2d69eefca3\x2d81ca\x2d97a4\x2d99ac\x2da81bfff58a1e.mount: Deactivated successfully. Feb 13 19:53:31.392542 systemd[1]: run-netns-cni\x2dde7ed093\x2dd443\x2daa7d\x2d8ebc\x2d22c7e413fb4d.mount: Deactivated successfully. Feb 13 19:53:31.392712 systemd[1]: run-netns-cni\x2d47dc9937\x2da7f3\x2d5a12\x2ddce9\x2d386408c0549e.mount: Deactivated successfully. Feb 13 19:53:31.517297 containerd[1536]: time="2025-02-13T19:53:31.517175619Z" level=error msg="Failed to destroy network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.519133 containerd[1536]: time="2025-02-13T19:53:31.518753129Z" level=error msg="encountered an error cleaning up failed sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.519133 containerd[1536]: time="2025-02-13T19:53:31.518793816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.519200 kubelet[2855]: E0213 19:53:31.518915 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:31.519200 kubelet[2855]: E0213 19:53:31.518946 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:31.519200 kubelet[2855]: E0213 19:53:31.518963 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:31.527679 kubelet[2855]: E0213 19:53:31.519005 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:31.520032 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb-shm.mount: Deactivated successfully. Feb 13 19:53:31.620736 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1014175821.mount: Deactivated successfully. Feb 13 19:53:31.991115 containerd[1536]: time="2025-02-13T19:53:31.991082268Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:32.009722 containerd[1536]: time="2025-02-13T19:53:32.009687862Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Feb 13 19:53:32.043322 containerd[1536]: time="2025-02-13T19:53:32.043287796Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:32.070027 containerd[1536]: time="2025-02-13T19:53:32.069462146Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:32.096956 containerd[1536]: time="2025-02-13T19:53:32.096929274Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 5.426104238s" Feb 13 19:53:32.097411 containerd[1536]: time="2025-02-13T19:53:32.096956922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Feb 13 19:53:32.165959 containerd[1536]: time="2025-02-13T19:53:32.165930636Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Feb 13 19:53:32.232237 containerd[1536]: time="2025-02-13T19:53:32.232039402Z" level=info msg="CreateContainer within sandbox \"85445a8ae21fb16518a5170c51a1ef6ea36d9662caf77b30c7aa1e8199cbea54\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b0162e2dd12bf553d59a8e4d93b28b7cf0311127e8be66e8063d690cd854067b\"" Feb 13 19:53:32.234521 containerd[1536]: time="2025-02-13T19:53:32.232439343Z" level=info msg="StartContainer for \"b0162e2dd12bf553d59a8e4d93b28b7cf0311127e8be66e8063d690cd854067b\"" Feb 13 19:53:32.320888 kubelet[2855]: I0213 19:53:32.320825 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb" Feb 13 19:53:32.323006 containerd[1536]: time="2025-02-13T19:53:32.322152951Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" Feb 13 19:53:32.323006 containerd[1536]: time="2025-02-13T19:53:32.322262565Z" level=info msg="Ensure that sandbox 31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb in task-service has been cleanup successfully" Feb 13 19:53:32.323074 kubelet[2855]: I0213 19:53:32.323053 2855 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323108766Z" level=info msg="TearDown network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" successfully" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323119417Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" returns successfully" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323362051Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323570521Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323615984Z" level=info msg="TearDown network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" successfully" Feb 13 19:53:32.323782 containerd[1536]: time="2025-02-13T19:53:32.323622577Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" returns successfully" Feb 13 19:53:32.324657 containerd[1536]: time="2025-02-13T19:53:32.323918321Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:32.324657 containerd[1536]: time="2025-02-13T19:53:32.324529954Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 19:53:32.324657 containerd[1536]: time="2025-02-13T19:53:32.324538280Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:32.324813 containerd[1536]: time="2025-02-13T19:53:32.324795853Z" level=info msg="StopPodSandbox for 
\"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:32.325687 containerd[1536]: time="2025-02-13T19:53:32.324874087Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:32.325687 containerd[1536]: time="2025-02-13T19:53:32.324883415Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:32.325687 containerd[1536]: time="2025-02-13T19:53:32.325172979Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:32.325687 containerd[1536]: time="2025-02-13T19:53:32.325209309Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:32.325687 containerd[1536]: time="2025-02-13T19:53:32.325215741Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:32.326286 containerd[1536]: time="2025-02-13T19:53:32.325885838Z" level=info msg="Ensure that sandbox 15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895 in task-service has been cleanup successfully" Feb 13 19:53:32.326286 containerd[1536]: time="2025-02-13T19:53:32.326106544Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:5,}" Feb 13 19:53:32.326446 containerd[1536]: time="2025-02-13T19:53:32.326430327Z" level=info msg="TearDown network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" successfully" Feb 13 19:53:32.326446 containerd[1536]: time="2025-02-13T19:53:32.326441411Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" returns successfully" Feb 13 19:53:32.326577 containerd[1536]: 
time="2025-02-13T19:53:32.326563694Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:32.326913 containerd[1536]: time="2025-02-13T19:53:32.326899582Z" level=info msg="TearDown network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" successfully" Feb 13 19:53:32.326913 containerd[1536]: time="2025-02-13T19:53:32.326908848Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.327747057Z" level=info msg="StopPodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.327845845Z" level=info msg="Ensure that sandbox 4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5 in task-service has been cleanup successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328184856Z" level=info msg="TearDown network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328193448Z" level=info msg="StopPodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328235069Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328272444Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328278427Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:32.330829 
containerd[1536]: time="2025-02-13T19:53:32.328429568Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328473832Z" level=info msg="TearDown network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328479972Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328555054Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328586509Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328601592Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328801261Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328852245Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328859573Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328892772Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:32.330829 containerd[1536]: 
time="2025-02-13T19:53:32.328958339Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.328967003Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.329259045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:5,}" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.329496699Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.329544467Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.329559731Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.330087466Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.330126671Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.330132601Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:32.330829 containerd[1536]: time="2025-02-13T19:53:32.330711778Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:5,}" Feb 13 19:53:32.334876 kubelet[2855]: I0213 19:53:32.327212 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5" Feb 13 19:53:32.334876 kubelet[2855]: I0213 19:53:32.331497 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3" Feb 13 19:53:32.334925 containerd[1536]: time="2025-02-13T19:53:32.332827294Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" Feb 13 19:53:32.334925 containerd[1536]: time="2025-02-13T19:53:32.332927627Z" level=info msg="Ensure that sandbox 32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3 in task-service has been cleanup successfully" Feb 13 19:53:32.334925 containerd[1536]: time="2025-02-13T19:53:32.333421179Z" level=info msg="TearDown network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" successfully" Feb 13 19:53:32.334925 containerd[1536]: time="2025-02-13T19:53:32.333431724Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" returns successfully" Feb 13 19:53:32.334925 containerd[1536]: time="2025-02-13T19:53:32.333692371Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:32.336666 containerd[1536]: time="2025-02-13T19:53:32.335831013Z" level=info msg="TearDown network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" successfully" Feb 13 19:53:32.336666 containerd[1536]: time="2025-02-13T19:53:32.335843521Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" returns successfully" Feb 13 
19:53:32.336666 containerd[1536]: time="2025-02-13T19:53:32.336413279Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:32.336666 containerd[1536]: time="2025-02-13T19:53:32.336526224Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:32.336666 containerd[1536]: time="2025-02-13T19:53:32.336619965Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:32.337323 containerd[1536]: time="2025-02-13T19:53:32.336967967Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:32.337323 containerd[1536]: time="2025-02-13T19:53:32.337316046Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:32.337394 kubelet[2855]: I0213 19:53:32.337091 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929" Feb 13 19:53:32.337423 containerd[1536]: time="2025-02-13T19:53:32.337324189Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:32.337587 containerd[1536]: time="2025-02-13T19:53:32.337555313Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" Feb 13 19:53:32.338639 containerd[1536]: time="2025-02-13T19:53:32.338570183Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:32.338765 containerd[1536]: time="2025-02-13T19:53:32.338662041Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:32.338765 
containerd[1536]: time="2025-02-13T19:53:32.338723806Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:32.338813 containerd[1536]: time="2025-02-13T19:53:32.338803327Z" level=info msg="Ensure that sandbox 63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929 in task-service has been cleanup successfully" Feb 13 19:53:32.339470 containerd[1536]: time="2025-02-13T19:53:32.339407565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:5,}" Feb 13 19:53:32.339703 containerd[1536]: time="2025-02-13T19:53:32.339612979Z" level=info msg="TearDown network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" successfully" Feb 13 19:53:32.339703 containerd[1536]: time="2025-02-13T19:53:32.339623231Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" returns successfully" Feb 13 19:53:32.340368 containerd[1536]: time="2025-02-13T19:53:32.340354224Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:32.340409 containerd[1536]: time="2025-02-13T19:53:32.340394590Z" level=info msg="TearDown network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" successfully" Feb 13 19:53:32.340409 containerd[1536]: time="2025-02-13T19:53:32.340400648Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" returns successfully" Feb 13 19:53:32.340643 containerd[1536]: time="2025-02-13T19:53:32.340629572Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:32.340887 containerd[1536]: time="2025-02-13T19:53:32.340835376Z" level=info msg="TearDown network for sandbox 
\"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:32.340887 containerd[1536]: time="2025-02-13T19:53:32.340843912Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:32.341913 containerd[1536]: time="2025-02-13T19:53:32.341899641Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:32.341949 containerd[1536]: time="2025-02-13T19:53:32.341939924Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:32.341949 containerd[1536]: time="2025-02-13T19:53:32.341946591Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:32.342736 containerd[1536]: time="2025-02-13T19:53:32.342721650Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:32.342769 containerd[1536]: time="2025-02-13T19:53:32.342764147Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:32.342843 containerd[1536]: time="2025-02-13T19:53:32.342770232Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:32.343497 containerd[1536]: time="2025-02-13T19:53:32.343463880Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:5,}" Feb 13 19:53:32.344403 kubelet[2855]: I0213 19:53:32.344389 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3" Feb 13 19:53:32.345766 containerd[1536]: 
time="2025-02-13T19:53:32.345743825Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" Feb 13 19:53:32.346105 containerd[1536]: time="2025-02-13T19:53:32.346091151Z" level=info msg="Ensure that sandbox e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3 in task-service has been cleanup successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.346545047Z" level=info msg="TearDown network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.346555233Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" returns successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.347104480Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.347145016Z" level=info msg="TearDown network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.347150806Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" returns successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.347757950Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348033108Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348040961Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:32.349029 
containerd[1536]: time="2025-02-13T19:53:32.348408294Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348455146Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348461825Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348594925Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348916772Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:32.349029 containerd[1536]: time="2025-02-13T19:53:32.348924868Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:32.349939 containerd[1536]: time="2025-02-13T19:53:32.349397057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:5,}" Feb 13 19:53:32.359097 systemd[1]: Started cri-containerd-b0162e2dd12bf553d59a8e4d93b28b7cf0311127e8be66e8063d690cd854067b.scope - libcontainer container b0162e2dd12bf553d59a8e4d93b28b7cf0311127e8be66e8063d690cd854067b. Feb 13 19:53:32.408913 systemd[1]: run-netns-cni\x2d58ec8501\x2d95d0\x2ddaed\x2df811\x2df2dd55d3db0c.mount: Deactivated successfully. Feb 13 19:53:32.408975 systemd[1]: run-netns-cni\x2d04865f3e\x2d619a\x2dc31c\x2d6f1f\x2d5fced6cee0a5.mount: Deactivated successfully. 
Feb 13 19:53:32.409024 systemd[1]: run-netns-cni\x2d02165c27\x2d0349\x2db1ac\x2d72c3\x2dda4450c48a39.mount: Deactivated successfully. Feb 13 19:53:32.409058 systemd[1]: run-netns-cni\x2d9adbd599\x2d166d\x2d0be1\x2daab0\x2d308635e19fda.mount: Deactivated successfully. Feb 13 19:53:32.409091 systemd[1]: run-netns-cni\x2dae00df0f\x2dbc29\x2dbf94\x2dd023\x2dbe8c246a2c01.mount: Deactivated successfully. Feb 13 19:53:32.409122 systemd[1]: run-netns-cni\x2dc58e2c86\x2dab79\x2d3101\x2dc827\x2d62fa1b5265f3.mount: Deactivated successfully. Feb 13 19:53:32.446931 containerd[1536]: time="2025-02-13T19:53:32.446401847Z" level=info msg="StartContainer for \"b0162e2dd12bf553d59a8e4d93b28b7cf0311127e8be66e8063d690cd854067b\" returns successfully" Feb 13 19:53:32.474907 containerd[1536]: time="2025-02-13T19:53:32.474799008Z" level=error msg="Failed to destroy network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.476758 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc-shm.mount: Deactivated successfully. 
Feb 13 19:53:32.478547 containerd[1536]: time="2025-02-13T19:53:32.478528198Z" level=error msg="encountered an error cleaning up failed sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.479370 containerd[1536]: time="2025-02-13T19:53:32.479355669Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.479630 kubelet[2855]: E0213 19:53:32.479607 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.479666 kubelet[2855]: E0213 19:53:32.479641 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:32.479666 kubelet[2855]: E0213 19:53:32.479654 2855 
kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-t5g9g" Feb 13 19:53:32.479871 kubelet[2855]: E0213 19:53:32.479685 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-t5g9g_kube-system(e2117422-eaf7-4842-8ba3-82e570607dc9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podUID="e2117422-eaf7-4842-8ba3-82e570607dc9" Feb 13 19:53:32.496834 containerd[1536]: time="2025-02-13T19:53:32.496795274Z" level=error msg="Failed to destroy network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.497136 containerd[1536]: time="2025-02-13T19:53:32.497117719Z" level=error msg="encountered an error cleaning up failed sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.497177 containerd[1536]: time="2025-02-13T19:53:32.497160783Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.500575 kubelet[2855]: E0213 19:53:32.497285 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.500575 kubelet[2855]: E0213 19:53:32.500503 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:32.498728 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996-shm.mount: Deactivated successfully. 
Feb 13 19:53:32.502115 kubelet[2855]: E0213 19:53:32.501326 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-rknrr" Feb 13 19:53:32.502115 kubelet[2855]: E0213 19:53:32.501367 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-rknrr_kube-system(9644224c-d01a-42e1-9f2e-df8377e29c31)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-rknrr" podUID="9644224c-d01a-42e1-9f2e-df8377e29c31" Feb 13 19:53:32.504912 containerd[1536]: time="2025-02-13T19:53:32.504886105Z" level=error msg="Failed to destroy network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.505100 containerd[1536]: time="2025-02-13T19:53:32.505083702Z" level=error msg="encountered an error cleaning up failed sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.505132 containerd[1536]: time="2025-02-13T19:53:32.505121581Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.505650 kubelet[2855]: E0213 19:53:32.505623 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.505726 kubelet[2855]: E0213 19:53:32.505715 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:32.505781 kubelet[2855]: E0213 19:53:32.505773 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-kzqv2" Feb 13 19:53:32.505937 kubelet[2855]: E0213 19:53:32.505831 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-kzqv2_calico-system(34af207b-f303-480e-a684-e96e850daafd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-kzqv2" podUID="34af207b-f303-480e-a684-e96e850daafd" Feb 13 19:53:32.507455 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6-shm.mount: Deactivated successfully. 
Feb 13 19:53:32.513606 containerd[1536]: time="2025-02-13T19:53:32.511544932Z" level=error msg="Failed to destroy network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.513606 containerd[1536]: time="2025-02-13T19:53:32.512539627Z" level=error msg="encountered an error cleaning up failed sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.513606 containerd[1536]: time="2025-02-13T19:53:32.512576413Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.513560 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021-shm.mount: Deactivated successfully. 
Feb 13 19:53:32.515650 kubelet[2855]: E0213 19:53:32.512684 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.515650 kubelet[2855]: E0213 19:53:32.512713 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:32.515650 kubelet[2855]: E0213 19:53:32.512725 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" Feb 13 19:53:32.515798 kubelet[2855]: E0213 19:53:32.512748 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-2lrbj_calico-apiserver(b7841c7a-d63f-4929-a2cc-5262cd7ba254)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podUID="b7841c7a-d63f-4929-a2cc-5262cd7ba254" Feb 13 19:53:32.516213 containerd[1536]: time="2025-02-13T19:53:32.516195875Z" level=error msg="Failed to destroy network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.516359 containerd[1536]: time="2025-02-13T19:53:32.516210148Z" level=error msg="Failed to destroy network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.516729 containerd[1536]: time="2025-02-13T19:53:32.516677851Z" level=error msg="encountered an error cleaning up failed sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.516843 containerd[1536]: time="2025-02-13T19:53:32.516714587Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.516968 containerd[1536]: time="2025-02-13T19:53:32.516923226Z" level=error msg="encountered an error cleaning up failed sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.516968 containerd[1536]: time="2025-02-13T19:53:32.516950158Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.517736 kubelet[2855]: E0213 19:53:32.517581 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.517736 kubelet[2855]: E0213 19:53:32.517662 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:32.517736 kubelet[2855]: E0213 19:53:32.517670 2855 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Feb 13 19:53:32.517736 kubelet[2855]: E0213 19:53:32.517675 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-855555987-xzhkk" Feb 13 19:53:32.517830 kubelet[2855]: E0213 19:53:32.517687 2855 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:32.517830 kubelet[2855]: E0213 19:53:32.517698 2855 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" Feb 13 19:53:32.517830 kubelet[2855]: E0213 19:53:32.517697 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-855555987-xzhkk_calico-system(9f58da82-3497-4839-8c99-878960089137)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podUID="9f58da82-3497-4839-8c99-878960089137" Feb 13 19:53:32.517899 kubelet[2855]: E0213 19:53:32.517714 2855 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-66fd84fdb4-cwm4r_calico-apiserver(9f3ee88c-5d3d-4b58-85b2-38498875ee0d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podUID="9f3ee88c-5d3d-4b58-85b2-38498875ee0d" Feb 13 19:53:32.659035 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Feb 13 19:53:32.661244 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Feb 13 19:53:33.279911 kubelet[2855]: I0213 19:53:33.279854 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:53:33.347999 kubelet[2855]: I0213 19:53:33.347952 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb" Feb 13 19:53:33.349420 containerd[1536]: time="2025-02-13T19:53:33.348780869Z" level=info msg="StopPodSandbox for \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\"" Feb 13 19:53:33.349420 containerd[1536]: time="2025-02-13T19:53:33.348898959Z" level=info msg="Ensure that sandbox 47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb in task-service has been cleanup successfully" Feb 13 19:53:33.349952 kubelet[2855]: I0213 19:53:33.349649 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc" Feb 13 19:53:33.350511 containerd[1536]: time="2025-02-13T19:53:33.350499129Z" level=info msg="StopPodSandbox for \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\"" Feb 13 19:53:33.351095 containerd[1536]: time="2025-02-13T19:53:33.350745694Z" level=info msg="Ensure that sandbox 5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc in task-service has been cleanup successfully" Feb 13 19:53:33.351095 containerd[1536]: time="2025-02-13T19:53:33.350502507Z" level=info msg="TearDown network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" successfully" Feb 13 19:53:33.351095 containerd[1536]: time="2025-02-13T19:53:33.350840485Z" level=info msg="StopPodSandbox for \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" returns successfully" Feb 13 19:53:33.351095 containerd[1536]: time="2025-02-13T19:53:33.350954399Z" level=info msg="TearDown network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" 
successfully" Feb 13 19:53:33.351291 containerd[1536]: time="2025-02-13T19:53:33.350962678Z" level=info msg="StopPodSandbox for \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" returns successfully" Feb 13 19:53:33.351291 containerd[1536]: time="2025-02-13T19:53:33.351107699Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" Feb 13 19:53:33.351291 containerd[1536]: time="2025-02-13T19:53:33.351253226Z" level=info msg="TearDown network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" successfully" Feb 13 19:53:33.351291 containerd[1536]: time="2025-02-13T19:53:33.351259042Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" returns successfully" Feb 13 19:53:33.351883 containerd[1536]: time="2025-02-13T19:53:33.351859278Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:33.351941 containerd[1536]: time="2025-02-13T19:53:33.351911230Z" level=info msg="TearDown network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" successfully" Feb 13 19:53:33.351941 containerd[1536]: time="2025-02-13T19:53:33.351938427Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" returns successfully" Feb 13 19:53:33.351995 containerd[1536]: time="2025-02-13T19:53:33.351968417Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" Feb 13 19:53:33.352626 containerd[1536]: time="2025-02-13T19:53:33.352532758Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:33.352626 containerd[1536]: time="2025-02-13T19:53:33.352541623Z" level=info msg="TearDown network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" successfully" Feb 13 
19:53:33.352626 containerd[1536]: time="2025-02-13T19:53:33.352550612Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" returns successfully" Feb 13 19:53:33.352626 containerd[1536]: time="2025-02-13T19:53:33.352574597Z" level=info msg="TearDown network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:33.352626 containerd[1536]: time="2025-02-13T19:53:33.352581060Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:33.353092 containerd[1536]: time="2025-02-13T19:53:33.352941703Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:33.353092 containerd[1536]: time="2025-02-13T19:53:33.353025465Z" level=info msg="TearDown network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" successfully" Feb 13 19:53:33.353092 containerd[1536]: time="2025-02-13T19:53:33.353033857Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" returns successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353153293Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353196061Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353201786Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353233142Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 
19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353264940Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353270486Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353313278Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353348629Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:33.353383 containerd[1536]: time="2025-02-13T19:53:33.353353570Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354114652Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354161022Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354171688Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354201974Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354232722Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 
19:53:33.354564 containerd[1536]: time="2025-02-13T19:53:33.354238740Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:33.355152 containerd[1536]: time="2025-02-13T19:53:33.354673400Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:6,}" Feb 13 19:53:33.355152 containerd[1536]: time="2025-02-13T19:53:33.355008525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:6,}" Feb 13 19:53:33.360802 kubelet[2855]: I0213 19:53:33.359020 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6" Feb 13 19:53:33.360885 containerd[1536]: time="2025-02-13T19:53:33.360431431Z" level=info msg="StopPodSandbox for \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\"" Feb 13 19:53:33.360885 containerd[1536]: time="2025-02-13T19:53:33.360561415Z" level=info msg="Ensure that sandbox 624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6 in task-service has been cleanup successfully" Feb 13 19:53:33.362998 containerd[1536]: time="2025-02-13T19:53:33.362975458Z" level=info msg="TearDown network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" successfully" Feb 13 19:53:33.363910 containerd[1536]: time="2025-02-13T19:53:33.363759141Z" level=info msg="StopPodSandbox for \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" returns successfully" Feb 13 19:53:33.364669 containerd[1536]: time="2025-02-13T19:53:33.364374880Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" Feb 13 19:53:33.364669 containerd[1536]: time="2025-02-13T19:53:33.364461788Z" 
level=info msg="TearDown network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" successfully" Feb 13 19:53:33.364669 containerd[1536]: time="2025-02-13T19:53:33.364479423Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" returns successfully" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365202475Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365256715Z" level=info msg="TearDown network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" successfully" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365263427Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" returns successfully" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365452746Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365494334Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 19:53:33.366157 containerd[1536]: time="2025-02-13T19:53:33.365503494Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:33.366814 containerd[1536]: time="2025-02-13T19:53:33.366688933Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:33.366814 containerd[1536]: time="2025-02-13T19:53:33.366740472Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:33.366814 containerd[1536]: time="2025-02-13T19:53:33.366785441Z" 
level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:33.368157 containerd[1536]: time="2025-02-13T19:53:33.368142221Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:33.368221 containerd[1536]: time="2025-02-13T19:53:33.368183840Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:33.368221 containerd[1536]: time="2025-02-13T19:53:33.368211500Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:33.368744 containerd[1536]: time="2025-02-13T19:53:33.368729027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:6,}" Feb 13 19:53:33.369027 kubelet[2855]: I0213 19:53:33.368973 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996" Feb 13 19:53:33.370270 containerd[1536]: time="2025-02-13T19:53:33.369881135Z" level=info msg="StopPodSandbox for \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\"" Feb 13 19:53:33.370270 containerd[1536]: time="2025-02-13T19:53:33.370003963Z" level=info msg="Ensure that sandbox 7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996 in task-service has been cleanup successfully" Feb 13 19:53:33.370270 containerd[1536]: time="2025-02-13T19:53:33.370156409Z" level=info msg="TearDown network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" successfully" Feb 13 19:53:33.370270 containerd[1536]: time="2025-02-13T19:53:33.370180335Z" level=info msg="StopPodSandbox for \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" returns successfully" 
Feb 13 19:53:33.373341 containerd[1536]: time="2025-02-13T19:53:33.372678533Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" Feb 13 19:53:33.374727 containerd[1536]: time="2025-02-13T19:53:33.373808515Z" level=info msg="TearDown network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" successfully" Feb 13 19:53:33.374727 containerd[1536]: time="2025-02-13T19:53:33.373820085Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" returns successfully" Feb 13 19:53:33.375389 containerd[1536]: time="2025-02-13T19:53:33.375099974Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:33.375389 containerd[1536]: time="2025-02-13T19:53:33.375162517Z" level=info msg="TearDown network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" successfully" Feb 13 19:53:33.375389 containerd[1536]: time="2025-02-13T19:53:33.375171605Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" returns successfully" Feb 13 19:53:33.376825 containerd[1536]: time="2025-02-13T19:53:33.376355728Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:33.376825 containerd[1536]: time="2025-02-13T19:53:33.376573714Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:33.376825 containerd[1536]: time="2025-02-13T19:53:33.376582080Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:33.377652 containerd[1536]: time="2025-02-13T19:53:33.377596179Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:33.377865 
containerd[1536]: time="2025-02-13T19:53:33.377823934Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:33.377865 containerd[1536]: time="2025-02-13T19:53:33.377833454Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:33.378567 kubelet[2855]: I0213 19:53:33.378219 2855 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a" Feb 13 19:53:33.379293 containerd[1536]: time="2025-02-13T19:53:33.379279113Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:33.379345 containerd[1536]: time="2025-02-13T19:53:33.379329497Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:33.379345 containerd[1536]: time="2025-02-13T19:53:33.379336103Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:33.385638 containerd[1536]: time="2025-02-13T19:53:33.381534106Z" level=info msg="StopPodSandbox for \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\"" Feb 13 19:53:33.385638 containerd[1536]: time="2025-02-13T19:53:33.384147731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:6,}" Feb 13 19:53:33.386824 containerd[1536]: time="2025-02-13T19:53:33.386705933Z" level=info msg="Ensure that sandbox 261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a in task-service has been cleanup successfully" Feb 13 19:53:33.387320 kubelet[2855]: I0213 19:53:33.387304 2855 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021" Feb 13 19:53:33.387463 containerd[1536]: time="2025-02-13T19:53:33.387405159Z" level=info msg="TearDown network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" successfully" Feb 13 19:53:33.387463 containerd[1536]: time="2025-02-13T19:53:33.387416016Z" level=info msg="StopPodSandbox for \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" returns successfully" Feb 13 19:53:33.388099 containerd[1536]: time="2025-02-13T19:53:33.387505602Z" level=info msg="StopPodSandbox for \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\"" Feb 13 19:53:33.388437 containerd[1536]: time="2025-02-13T19:53:33.388362207Z" level=info msg="Ensure that sandbox 3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021 in task-service has been cleanup successfully" Feb 13 19:53:33.388522 containerd[1536]: time="2025-02-13T19:53:33.388511342Z" level=info msg="TearDown network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" successfully" Feb 13 19:53:33.388574 containerd[1536]: time="2025-02-13T19:53:33.388566748Z" level=info msg="StopPodSandbox for \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" returns successfully" Feb 13 19:53:33.389359 containerd[1536]: time="2025-02-13T19:53:33.389325263Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" Feb 13 19:53:33.389684 containerd[1536]: time="2025-02-13T19:53:33.389456107Z" level=info msg="StopPodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" Feb 13 19:53:33.389684 containerd[1536]: time="2025-02-13T19:53:33.389554538Z" level=info msg="TearDown network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" successfully" Feb 13 19:53:33.389684 containerd[1536]: time="2025-02-13T19:53:33.389561550Z" level=info msg="StopPodSandbox 
for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" returns successfully" Feb 13 19:53:33.389684 containerd[1536]: time="2025-02-13T19:53:33.389608219Z" level=info msg="TearDown network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" successfully" Feb 13 19:53:33.389684 containerd[1536]: time="2025-02-13T19:53:33.389614766Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" returns successfully" Feb 13 19:53:33.390047 containerd[1536]: time="2025-02-13T19:53:33.390012811Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:33.390155 containerd[1536]: time="2025-02-13T19:53:33.390145632Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:33.390366 containerd[1536]: time="2025-02-13T19:53:33.390112119Z" level=info msg="TearDown network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" successfully" Feb 13 19:53:33.390366 containerd[1536]: time="2025-02-13T19:53:33.390297056Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" returns successfully" Feb 13 19:53:33.390635 containerd[1536]: time="2025-02-13T19:53:33.390538577Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:33.390635 containerd[1536]: time="2025-02-13T19:53:33.390603467Z" level=info msg="TearDown network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" successfully" Feb 13 19:53:33.390635 containerd[1536]: time="2025-02-13T19:53:33.390610033Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.390764151Z" level=info msg="StopPodSandbox for 
\"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.390902232Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.390909705Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393054468Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393470121Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393478731Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393384413Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393528095Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393690812Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393773625Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393835716Z" level=info msg="TearDown network for sandbox 
\"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393842871Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393906477Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.393913631Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.394150278Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.394289837Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.394297079Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.395099574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:6,}" Feb 13 19:53:33.398923 containerd[1536]: time="2025-02-13T19:53:33.395273081Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:6,}" Feb 13 19:53:33.395888 systemd[1]: run-netns-cni\x2dade9885e\x2ddb50\x2dee3c\x2d5a76\x2d8af74a225f4f.mount: Deactivated successfully. 
Feb 13 19:53:33.396038 systemd[1]: run-netns-cni\x2d9051ad17\x2dfd2e\x2d036a\x2de206\x2da316e1f73aee.mount: Deactivated successfully. Feb 13 19:53:33.396078 systemd[1]: run-netns-cni\x2d3f07d50b\x2d2ed7\x2da5e9\x2d9523\x2dc6e6115472e1.mount: Deactivated successfully. Feb 13 19:53:33.396182 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb-shm.mount: Deactivated successfully. Feb 13 19:53:33.396262 systemd[1]: run-netns-cni\x2d7a568f6a\x2d1789\x2d17c5\x2dd418\x2db5462adcb50e.mount: Deactivated successfully. Feb 13 19:53:33.396319 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a-shm.mount: Deactivated successfully. Feb 13 19:53:33.396355 systemd[1]: run-netns-cni\x2d23f8d057\x2db474\x2dc349\x2d63ed\x2d720d4d4e13fd.mount: Deactivated successfully. Feb 13 19:53:33.396387 systemd[1]: run-netns-cni\x2db40f852f\x2d03fb\x2d390e\x2dd407\x2d53d1bd872124.mount: Deactivated successfully. 
Feb 13 19:53:33.561180 kubelet[2855]: I0213 19:53:33.379129 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mhzzt" podStartSLOduration=4.069234373 podStartE2EDuration="21.373215405s" podCreationTimestamp="2025-02-13 19:53:12 +0000 UTC" firstStartedPulling="2025-02-13 19:53:14.793640561 +0000 UTC m=+23.356274955" lastFinishedPulling="2025-02-13 19:53:32.097621591 +0000 UTC m=+40.660255987" observedRunningTime="2025-02-13 19:53:33.372259884 +0000 UTC m=+41.934894286" watchObservedRunningTime="2025-02-13 19:53:33.373215405 +0000 UTC m=+41.935849802" Feb 13 19:53:33.930467 systemd-networkd[1445]: cali0e85d43a35e: Link UP Feb 13 19:53:33.930573 systemd-networkd[1445]: cali0e85d43a35e: Gained carrier Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.597 [INFO][4840] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4840] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--kzqv2-eth0 csi-node-driver- calico-system 34af207b-f303-480e-a684-e96e850daafd 589 0 2025-02-13 19:53:13 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-kzqv2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0e85d43a35e [] []}} ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4840] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.881 [INFO][4891] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" HandleID="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Workload="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4891] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" HandleID="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Workload="localhost-k8s-csi--node--driver--kzqv2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000311a80), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-kzqv2", "timestamp":"2025-02-13 19:53:33.881900816 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4891] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4891] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.902 [INFO][4891] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.903 [INFO][4891] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.909 [INFO][4891] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.911 [INFO][4891] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.912 [INFO][4891] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.913 [INFO][4891] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.913 [INFO][4891] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.913 [INFO][4891] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311 Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.915 [INFO][4891] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4891] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4891] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" host="localhost" Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4891] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:53:33.946175 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4891] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" HandleID="k8s-pod-network.3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Workload="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.919 [INFO][4840] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kzqv2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"34af207b-f303-480e-a684-e96e850daafd", ResourceVersion:"589", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-kzqv2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0e85d43a35e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.919 [INFO][4840] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.919 [INFO][4840] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0e85d43a35e ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.930 [INFO][4840] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.931 [INFO][4840] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" 
Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--kzqv2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"34af207b-f303-480e-a684-e96e850daafd", ResourceVersion:"589", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311", Pod:"csi-node-driver-kzqv2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0e85d43a35e", MAC:"a6:19:ea:62:d0:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.948230 containerd[1536]: 2025-02-13 19:53:33.944 [INFO][4840] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311" Namespace="calico-system" Pod="csi-node-driver-kzqv2" WorkloadEndpoint="localhost-k8s-csi--node--driver--kzqv2-eth0" Feb 13 19:53:33.955244 
systemd-networkd[1445]: cali21785a600c3: Link UP Feb 13 19:53:33.957389 systemd-networkd[1445]: cali21785a600c3: Gained carrier Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.602 [INFO][4849] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4849] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0 calico-apiserver-66fd84fdb4- calico-apiserver 9f3ee88c-5d3d-4b58-85b2-38498875ee0d 687 0 2025-02-13 19:53:12 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:66fd84fdb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-66fd84fdb4-cwm4r eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali21785a600c3 [] []}} ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4849] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.881 [INFO][4895] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" HandleID="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" 
Workload="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4895] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" HandleID="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Workload="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004133a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-66fd84fdb4-cwm4r", "timestamp":"2025-02-13 19:53:33.881853714 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4895] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4895] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.918 [INFO][4895] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.920 [INFO][4895] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.924 [INFO][4895] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.928 [INFO][4895] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.931 [INFO][4895] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.936 [INFO][4895] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.937 [INFO][4895] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.938 [INFO][4895] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071 Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.943 [INFO][4895] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.947 [INFO][4895] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.947 [INFO][4895] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" host="localhost" Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.947 [INFO][4895] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:53:33.974775 containerd[1536]: 2025-02-13 19:53:33.948 [INFO][4895] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" HandleID="k8s-pod-network.2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Workload="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.952 [INFO][4849] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0", GenerateName:"calico-apiserver-66fd84fdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"9f3ee88c-5d3d-4b58-85b2-38498875ee0d", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66fd84fdb4", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-66fd84fdb4-cwm4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21785a600c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.952 [INFO][4849] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.952 [INFO][4849] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali21785a600c3 ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.956 [INFO][4849] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.960 [INFO][4849] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0", GenerateName:"calico-apiserver-66fd84fdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"9f3ee88c-5d3d-4b58-85b2-38498875ee0d", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66fd84fdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071", Pod:"calico-apiserver-66fd84fdb4-cwm4r", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali21785a600c3", MAC:"da:a1:21:fa:ed:a1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.976143 containerd[1536]: 2025-02-13 19:53:33.972 [INFO][4849] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-cwm4r" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--cwm4r-eth0" Feb 13 19:53:33.984628 systemd-networkd[1445]: cali18b4e24f478: Link UP Feb 13 19:53:33.986042 systemd-networkd[1445]: cali18b4e24f478: Gained carrier Feb 13 19:53:33.993196 containerd[1536]: time="2025-02-13T19:53:33.993114635Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:33.993196 containerd[1536]: time="2025-02-13T19:53:33.993159546Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:33.994298 containerd[1536]: time="2025-02-13T19:53:33.993171382Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:33.994453 containerd[1536]: time="2025-02-13T19:53:33.993375965Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.654 [INFO][4864] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.685 [INFO][4864] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0 coredns-7db6d8ff4d- kube-system 9644224c-d01a-42e1-9f2e-df8377e29c31 684 0 2025-02-13 19:53:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-rknrr eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali18b4e24f478 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.685 [INFO][4864] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.882 [INFO][4894] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" HandleID="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Workload="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4894] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" HandleID="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Workload="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000412b90), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-rknrr", "timestamp":"2025-02-13 19:53:33.882709408 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.900 [INFO][4894] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.948 [INFO][4894] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.948 [INFO][4894] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.949 [INFO][4894] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.955 [INFO][4894] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.960 [INFO][4894] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.965 [INFO][4894] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.970 [INFO][4894] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.971 [INFO][4894] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.972 [INFO][4894] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178 Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.975 [INFO][4894] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.979 [INFO][4894] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.979 [INFO][4894] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" host="localhost" Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.980 [INFO][4894] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:53:33.996054 containerd[1536]: 2025-02-13 19:53:33.980 [INFO][4894] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" HandleID="k8s-pod-network.4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Workload="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.982 [INFO][4864] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"9644224c-d01a-42e1-9f2e-df8377e29c31", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-rknrr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali18b4e24f478", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.982 [INFO][4864] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.982 [INFO][4864] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali18b4e24f478 ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.983 [INFO][4864] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.984 [INFO][4864] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"9644224c-d01a-42e1-9f2e-df8377e29c31", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178", Pod:"coredns-7db6d8ff4d-rknrr", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali18b4e24f478", MAC:"ee:97:6d:05:0c:ee", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:33.996773 containerd[1536]: 2025-02-13 19:53:33.994 [INFO][4864] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-rknrr" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--rknrr-eth0" Feb 13 19:53:34.015119 systemd[1]: Started cri-containerd-3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311.scope - libcontainer container 3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311. Feb 13 19:53:34.024161 systemd-networkd[1445]: calibd7ce0f6eff: Link UP Feb 13 19:53:34.025179 systemd-networkd[1445]: calibd7ce0f6eff: Gained carrier Feb 13 19:53:34.029764 containerd[1536]: time="2025-02-13T19:53:34.029167074Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:34.029764 containerd[1536]: time="2025-02-13T19:53:34.029281497Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:34.029764 containerd[1536]: time="2025-02-13T19:53:34.029310577Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.029764 containerd[1536]: time="2025-02-13T19:53:34.029402279Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.042374 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.044293 containerd[1536]: time="2025-02-13T19:53:34.044085151Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:34.044831 containerd[1536]: time="2025-02-13T19:53:34.044232092Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:34.044831 containerd[1536]: time="2025-02-13T19:53:34.044803511Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.045911 containerd[1536]: time="2025-02-13T19:53:34.045882922Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.053154 systemd[1]: Started cri-containerd-2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071.scope - libcontainer container 2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071. Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.412 [INFO][4829] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4829] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0 calico-kube-controllers-855555987- calico-system 9f58da82-3497-4839-8c99-878960089137 685 0 2025-02-13 19:53:13 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:855555987 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-855555987-xzhkk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibd7ce0f6eff [] []}} ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.645 [INFO][4829] cni-plugin/k8s.go 77: Extracted identifiers for 
CmdAddK8s ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.881 [INFO][4892] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" HandleID="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Workload="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4892] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" HandleID="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Workload="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050400), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-855555987-xzhkk", "timestamp":"2025-02-13 19:53:33.881791799 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4892] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.979 [INFO][4892] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.979 [INFO][4892] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.981 [INFO][4892] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.988 [INFO][4892] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:33.998 [INFO][4892] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.000 [INFO][4892] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.002 [INFO][4892] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.002 [INFO][4892] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.003 [INFO][4892] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.008 [INFO][4892] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4892] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4892] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" host="localhost" Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4892] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:53:34.063287 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4892] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" HandleID="k8s-pod-network.67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Workload="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.018 [INFO][4829] cni-plugin/k8s.go 386: Populated endpoint ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0", GenerateName:"calico-kube-controllers-855555987-", Namespace:"calico-system", SelfLink:"", UID:"9f58da82-3497-4839-8c99-878960089137", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"855555987", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-855555987-xzhkk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibd7ce0f6eff", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.018 [INFO][4829] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.019 [INFO][4829] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibd7ce0f6eff ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.021 [INFO][4829] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.021 [INFO][4829] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0", GenerateName:"calico-kube-controllers-855555987-", Namespace:"calico-system", SelfLink:"", UID:"9f58da82-3497-4839-8c99-878960089137", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 13, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"855555987", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d", Pod:"calico-kube-controllers-855555987-xzhkk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibd7ce0f6eff", MAC:"f2:11:2f:c5:5b:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.063776 containerd[1536]: 2025-02-13 19:53:34.052 [INFO][4829] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d" Namespace="calico-system" Pod="calico-kube-controllers-855555987-xzhkk" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--855555987--xzhkk-eth0" Feb 13 19:53:34.099075 systemd[1]: Started cri-containerd-4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178.scope - libcontainer container 4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178. Feb 13 19:53:34.103333 systemd-networkd[1445]: cali32041ce7852: Link UP Feb 13 19:53:34.104581 containerd[1536]: time="2025-02-13T19:53:34.104337127Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-kzqv2,Uid:34af207b-f303-480e-a684-e96e850daafd,Namespace:calico-system,Attempt:6,} returns sandbox id \"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311\"" Feb 13 19:53:34.107027 systemd-networkd[1445]: cali32041ce7852: Gained carrier Feb 13 19:53:34.120021 containerd[1536]: time="2025-02-13T19:53:34.119210343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Feb 13 19:53:34.124260 containerd[1536]: time="2025-02-13T19:53:34.124067242Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:34.124260 containerd[1536]: time="2025-02-13T19:53:34.124133453Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:34.124260 containerd[1536]: time="2025-02-13T19:53:34.124144630Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.124260 containerd[1536]: time="2025-02-13T19:53:34.124231374Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.409 [INFO][4819] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.644 [INFO][4819] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0 coredns-7db6d8ff4d- kube-system e2117422-eaf7-4842-8ba3-82e570607dc9 686 0 2025-02-13 19:53:07 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-t5g9g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali32041ce7852 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.644 [INFO][4819] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.881 [INFO][4893] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" HandleID="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Workload="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4893] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" HandleID="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Workload="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003c28d0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-t5g9g", "timestamp":"2025-02-13 19:53:33.881791216 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4893] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4893] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.014 [INFO][4893] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.016 [INFO][4893] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.020 [INFO][4893] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.033 [INFO][4893] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.051 [INFO][4893] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.057 [INFO][4893] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.058 [INFO][4893] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.066 [INFO][4893] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724 Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.077 [INFO][4893] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.088 [INFO][4893] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.088 [INFO][4893] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" host="localhost" Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.088 [INFO][4893] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Feb 13 19:53:34.130498 containerd[1536]: 2025-02-13 19:53:34.088 [INFO][4893] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" HandleID="k8s-pod-network.185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Workload="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.098 [INFO][4819] cni-plugin/k8s.go 386: Populated endpoint ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"e2117422-eaf7-4842-8ba3-82e570607dc9", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-t5g9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali32041ce7852", MAC:"", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.098 [INFO][4819] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.098 [INFO][4819] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32041ce7852 ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.106 [INFO][4819] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.107 [INFO][4819] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"e2117422-eaf7-4842-8ba3-82e570607dc9", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724", Pod:"coredns-7db6d8ff4d-t5g9g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali32041ce7852", MAC:"36:26:7a:f5:93:36", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.130928 containerd[1536]: 2025-02-13 19:53:34.128 [INFO][4819] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-t5g9g" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--t5g9g-eth0" Feb 13 19:53:34.138184 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.150119 systemd[1]: Started cri-containerd-67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d.scope - libcontainer container 67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d. Feb 13 19:53:34.173938 containerd[1536]: time="2025-02-13T19:53:34.173871149Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:34.174051 containerd[1536]: time="2025-02-13T19:53:34.174020942Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:34.174051 containerd[1536]: time="2025-02-13T19:53:34.174037318Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.174228 containerd[1536]: time="2025-02-13T19:53:34.174196805Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.177139 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.190658 systemd-networkd[1445]: cali001e79b59c8: Link UP Feb 13 19:53:34.192884 systemd[1]: Started cri-containerd-185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724.scope - libcontainer container 185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724. 
Feb 13 19:53:34.193879 systemd-networkd[1445]: cali001e79b59c8: Gained carrier Feb 13 19:53:34.211371 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.658 [INFO][4871] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.685 [INFO][4871] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0 calico-apiserver-66fd84fdb4- calico-apiserver b7841c7a-d63f-4929-a2cc-5262cd7ba254 688 0 2025-02-13 19:53:12 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:66fd84fdb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-66fd84fdb4-2lrbj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali001e79b59c8 [] []}} ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.685 [INFO][4871] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.881 [INFO][4896] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" 
HandleID="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Workload="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4896] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" HandleID="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Workload="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c4e50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-66fd84fdb4-2lrbj", "timestamp":"2025-02-13 19:53:33.881841999 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:33.901 [INFO][4896] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.090 [INFO][4896] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.090 [INFO][4896] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.093 [INFO][4896] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.112 [INFO][4896] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.132 [INFO][4896] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.137 [INFO][4896] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.147 [INFO][4896] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.147 [INFO][4896] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.151 [INFO][4896] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314 Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.163 [INFO][4896] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.178 [INFO][4896] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.179 [INFO][4896] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" host="localhost" Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.179 [INFO][4896] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Feb 13 19:53:34.225715 containerd[1536]: 2025-02-13 19:53:34.179 [INFO][4896] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" HandleID="k8s-pod-network.6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Workload="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.184 [INFO][4871] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0", GenerateName:"calico-apiserver-66fd84fdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7841c7a-d63f-4929-a2cc-5262cd7ba254", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66fd84fdb4", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-66fd84fdb4-2lrbj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali001e79b59c8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.185 [INFO][4871] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.185 [INFO][4871] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali001e79b59c8 ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.193 [INFO][4871] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.195 [INFO][4871] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0", GenerateName:"calico-apiserver-66fd84fdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"b7841c7a-d63f-4929-a2cc-5262cd7ba254", ResourceVersion:"688", Generation:0, CreationTimestamp:time.Date(2025, time.February, 13, 19, 53, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"66fd84fdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314", Pod:"calico-apiserver-66fd84fdb4-2lrbj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali001e79b59c8", MAC:"76:48:af:75:0f:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Feb 13 19:53:34.226703 containerd[1536]: 2025-02-13 19:53:34.213 [INFO][4871] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314" Namespace="calico-apiserver" Pod="calico-apiserver-66fd84fdb4-2lrbj" WorkloadEndpoint="localhost-k8s-calico--apiserver--66fd84fdb4--2lrbj-eth0" Feb 13 19:53:34.227081 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.239894 containerd[1536]: time="2025-02-13T19:53:34.239824786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-rknrr,Uid:9644224c-d01a-42e1-9f2e-df8377e29c31,Namespace:kube-system,Attempt:6,} returns sandbox id \"4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178\"" Feb 13 19:53:34.243835 containerd[1536]: time="2025-02-13T19:53:34.243753782Z" level=info msg="CreateContainer within sandbox \"4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 19:53:34.261230 containerd[1536]: time="2025-02-13T19:53:34.260798752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-t5g9g,Uid:e2117422-eaf7-4842-8ba3-82e570607dc9,Namespace:kube-system,Attempt:6,} returns sandbox id \"185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724\"" Feb 13 19:53:34.267302 containerd[1536]: time="2025-02-13T19:53:34.266586145Z" level=info msg="CreateContainer within sandbox \"185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Feb 13 19:53:34.281297 containerd[1536]: time="2025-02-13T19:53:34.274062520Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Feb 13 19:53:34.281297 containerd[1536]: time="2025-02-13T19:53:34.274397937Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Feb 13 19:53:34.281297 containerd[1536]: time="2025-02-13T19:53:34.274411568Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.281297 containerd[1536]: time="2025-02-13T19:53:34.276283141Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Feb 13 19:53:34.301092 systemd[1]: Started cri-containerd-6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314.scope - libcontainer container 6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314. Feb 13 19:53:34.309744 containerd[1536]: time="2025-02-13T19:53:34.309689077Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-855555987-xzhkk,Uid:9f58da82-3497-4839-8c99-878960089137,Namespace:calico-system,Attempt:6,} returns sandbox id \"67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d\"" Feb 13 19:53:34.313072 containerd[1536]: time="2025-02-13T19:53:34.312556892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-cwm4r,Uid:9f3ee88c-5d3d-4b58-85b2-38498875ee0d,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071\"" Feb 13 19:53:34.322041 systemd-resolved[1446]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Feb 13 19:53:34.331578 containerd[1536]: time="2025-02-13T19:53:34.331499960Z" level=info msg="CreateContainer within sandbox \"185b08a03754f0462bcd8d21da516842a329e949ceb14cf16a067fcfe2f2c724\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e2e7c9e05f52772aaab6c2c5d58bf13cfbdd66cb41ae09aed22923e71819059d\"" Feb 13 19:53:34.332049 containerd[1536]: time="2025-02-13T19:53:34.332029006Z" level=info msg="StartContainer for 
\"e2e7c9e05f52772aaab6c2c5d58bf13cfbdd66cb41ae09aed22923e71819059d\"" Feb 13 19:53:34.336708 containerd[1536]: time="2025-02-13T19:53:34.336398659Z" level=info msg="CreateContainer within sandbox \"4d6e1b6f7e2b0010d52545fd063a239e53c35a51cf77ffdf2ff111d25e647178\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e8e97f23e120fe49c0bc5e4efe3ccef48925df1edf82d2ea980c5ac120fab219\"" Feb 13 19:53:34.337183 containerd[1536]: time="2025-02-13T19:53:34.337055745Z" level=info msg="StartContainer for \"e8e97f23e120fe49c0bc5e4efe3ccef48925df1edf82d2ea980c5ac120fab219\"" Feb 13 19:53:34.366453 systemd[1]: Started cri-containerd-e2e7c9e05f52772aaab6c2c5d58bf13cfbdd66cb41ae09aed22923e71819059d.scope - libcontainer container e2e7c9e05f52772aaab6c2c5d58bf13cfbdd66cb41ae09aed22923e71819059d. Feb 13 19:53:34.378617 systemd[1]: Started cri-containerd-e8e97f23e120fe49c0bc5e4efe3ccef48925df1edf82d2ea980c5ac120fab219.scope - libcontainer container e8e97f23e120fe49c0bc5e4efe3ccef48925df1edf82d2ea980c5ac120fab219. 
Feb 13 19:53:34.407995 containerd[1536]: time="2025-02-13T19:53:34.407563362Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-66fd84fdb4-2lrbj,Uid:b7841c7a-d63f-4929-a2cc-5262cd7ba254,Namespace:calico-apiserver,Attempt:6,} returns sandbox id \"6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314\"" Feb 13 19:53:34.452185 containerd[1536]: time="2025-02-13T19:53:34.452113718Z" level=info msg="StartContainer for \"e8e97f23e120fe49c0bc5e4efe3ccef48925df1edf82d2ea980c5ac120fab219\" returns successfully" Feb 13 19:53:34.462049 containerd[1536]: time="2025-02-13T19:53:34.462001346Z" level=info msg="StartContainer for \"e2e7c9e05f52772aaab6c2c5d58bf13cfbdd66cb41ae09aed22923e71819059d\" returns successfully" Feb 13 19:53:34.469743 kubelet[2855]: I0213 19:53:34.469711 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:53:34.669000 kernel: bpftool[5433]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Feb 13 19:53:34.823239 systemd-networkd[1445]: vxlan.calico: Link UP Feb 13 19:53:34.823245 systemd-networkd[1445]: vxlan.calico: Gained carrier Feb 13 19:53:35.330106 systemd-networkd[1445]: calibd7ce0f6eff: Gained IPv6LL Feb 13 19:53:35.483611 kubelet[2855]: I0213 19:53:35.483426 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-rknrr" podStartSLOduration=28.483414369 podStartE2EDuration="28.483414369s" podCreationTimestamp="2025-02-13 19:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:53:34.571077207 +0000 UTC m=+43.133711617" watchObservedRunningTime="2025-02-13 19:53:35.483414369 +0000 UTC m=+44.046048765" Feb 13 19:53:35.500794 kubelet[2855]: I0213 19:53:35.500739 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-t5g9g" podStartSLOduration=28.500725634 
podStartE2EDuration="28.500725634s" podCreationTimestamp="2025-02-13 19:53:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-02-13 19:53:35.498333303 +0000 UTC m=+44.060967710" watchObservedRunningTime="2025-02-13 19:53:35.500725634 +0000 UTC m=+44.063360029" Feb 13 19:53:35.523294 systemd-networkd[1445]: cali0e85d43a35e: Gained IPv6LL Feb 13 19:53:35.523470 systemd-networkd[1445]: cali18b4e24f478: Gained IPv6LL Feb 13 19:53:35.586172 systemd-networkd[1445]: cali21785a600c3: Gained IPv6LL Feb 13 19:53:35.692258 containerd[1536]: time="2025-02-13T19:53:35.692231550Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:35.692755 containerd[1536]: time="2025-02-13T19:53:35.692710343Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Feb 13 19:53:35.693333 containerd[1536]: time="2025-02-13T19:53:35.693012646Z" level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:35.694894 containerd[1536]: time="2025-02-13T19:53:35.694870660Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:35.695287 containerd[1536]: time="2025-02-13T19:53:35.695270464Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.576039332s" Feb 13 19:53:35.695320 
containerd[1536]: time="2025-02-13T19:53:35.695288337Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Feb 13 19:53:35.696057 containerd[1536]: time="2025-02-13T19:53:35.696042450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Feb 13 19:53:35.697417 containerd[1536]: time="2025-02-13T19:53:35.697404887Z" level=info msg="CreateContainer within sandbox \"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Feb 13 19:53:35.707462 containerd[1536]: time="2025-02-13T19:53:35.707433943Z" level=info msg="CreateContainer within sandbox \"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a\"" Feb 13 19:53:35.708093 containerd[1536]: time="2025-02-13T19:53:35.708045711Z" level=info msg="StartContainer for \"9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a\"" Feb 13 19:53:35.727315 systemd[1]: run-containerd-runc-k8s.io-9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a-runc.nNdHJN.mount: Deactivated successfully. Feb 13 19:53:35.732065 systemd[1]: Started cri-containerd-9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a.scope - libcontainer container 9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a. 
Feb 13 19:53:35.750616 containerd[1536]: time="2025-02-13T19:53:35.750530520Z" level=info msg="StartContainer for \"9cbe03e0d01e232afb177fe29c2ee81f15327ddb1737d9d2746a27aa24eca24a\" returns successfully" Feb 13 19:53:35.778119 systemd-networkd[1445]: cali001e79b59c8: Gained IPv6LL Feb 13 19:53:35.970226 systemd-networkd[1445]: cali32041ce7852: Gained IPv6LL Feb 13 19:53:36.098082 systemd-networkd[1445]: vxlan.calico: Gained IPv6LL Feb 13 19:53:37.696022 containerd[1536]: time="2025-02-13T19:53:37.695382302Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:37.699238 containerd[1536]: time="2025-02-13T19:53:37.699208869Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Feb 13 19:53:37.706604 containerd[1536]: time="2025-02-13T19:53:37.706570726Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:37.717871 containerd[1536]: time="2025-02-13T19:53:37.717839441Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:37.718373 containerd[1536]: time="2025-02-13T19:53:37.718251239Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.022189323s" Feb 13 19:53:37.718373 containerd[1536]: time="2025-02-13T19:53:37.718275165Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Feb 13 19:53:37.719434 containerd[1536]: time="2025-02-13T19:53:37.719374059Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 19:53:37.778030 containerd[1536]: time="2025-02-13T19:53:37.777846042Z" level=info msg="CreateContainer within sandbox \"67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Feb 13 19:53:37.785842 containerd[1536]: time="2025-02-13T19:53:37.785817187Z" level=info msg="CreateContainer within sandbox \"67fc19fb8e2415705ea53dfbc6e125bbd0aa34ba048048ec4dbc7ab3799e0b7d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"146b0c6d2dc88f315926b779749a5cb6b91d0754303b6b1781f86a489f6b1fb2\"" Feb 13 19:53:37.786155 containerd[1536]: time="2025-02-13T19:53:37.786135240Z" level=info msg="StartContainer for \"146b0c6d2dc88f315926b779749a5cb6b91d0754303b6b1781f86a489f6b1fb2\"" Feb 13 19:53:37.815092 systemd[1]: Started cri-containerd-146b0c6d2dc88f315926b779749a5cb6b91d0754303b6b1781f86a489f6b1fb2.scope - libcontainer container 146b0c6d2dc88f315926b779749a5cb6b91d0754303b6b1781f86a489f6b1fb2. 
Feb 13 19:53:37.847065 containerd[1536]: time="2025-02-13T19:53:37.847032837Z" level=info msg="StartContainer for \"146b0c6d2dc88f315926b779749a5cb6b91d0754303b6b1781f86a489f6b1fb2\" returns successfully" Feb 13 19:53:38.530843 kubelet[2855]: I0213 19:53:38.530643 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-855555987-xzhkk" podStartSLOduration=22.123618312 podStartE2EDuration="25.530621198s" podCreationTimestamp="2025-02-13 19:53:13 +0000 UTC" firstStartedPulling="2025-02-13 19:53:34.312162868 +0000 UTC m=+42.874797261" lastFinishedPulling="2025-02-13 19:53:37.719165748 +0000 UTC m=+46.281800147" observedRunningTime="2025-02-13 19:53:38.529573897 +0000 UTC m=+47.092208299" watchObservedRunningTime="2025-02-13 19:53:38.530621198 +0000 UTC m=+47.093255600" Feb 13 19:53:40.163266 containerd[1536]: time="2025-02-13T19:53:40.163192816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:40.163266 containerd[1536]: time="2025-02-13T19:53:40.163230372Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Feb 13 19:53:40.165737 containerd[1536]: time="2025-02-13T19:53:40.164090504Z" level=info msg="ImageCreate event name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:40.165737 containerd[1536]: time="2025-02-13T19:53:40.165443713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:40.165914 containerd[1536]: time="2025-02-13T19:53:40.165809490Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id 
\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.446414689s" Feb 13 19:53:40.165914 containerd[1536]: time="2025-02-13T19:53:40.165827410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 19:53:40.167175 containerd[1536]: time="2025-02-13T19:53:40.166819624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Feb 13 19:53:40.168465 containerd[1536]: time="2025-02-13T19:53:40.168444304Z" level=info msg="CreateContainer within sandbox \"2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 19:53:40.191781 containerd[1536]: time="2025-02-13T19:53:40.191700128Z" level=info msg="CreateContainer within sandbox \"2be3a6e6a6afa8e20e59d0e2a50ce40e97ce20863d89a0412f8b22518d44c071\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"67dda94388823872fe0abed84e18a2345fee3553831d06b72d7023c9a353502f\"" Feb 13 19:53:40.193596 containerd[1536]: time="2025-02-13T19:53:40.192094697Z" level=info msg="StartContainer for \"67dda94388823872fe0abed84e18a2345fee3553831d06b72d7023c9a353502f\"" Feb 13 19:53:40.235102 systemd[1]: Started cri-containerd-67dda94388823872fe0abed84e18a2345fee3553831d06b72d7023c9a353502f.scope - libcontainer container 67dda94388823872fe0abed84e18a2345fee3553831d06b72d7023c9a353502f. 
Feb 13 19:53:40.267644 containerd[1536]: time="2025-02-13T19:53:40.267615946Z" level=info msg="StartContainer for \"67dda94388823872fe0abed84e18a2345fee3553831d06b72d7023c9a353502f\" returns successfully" Feb 13 19:53:40.524218 kubelet[2855]: I0213 19:53:40.523667 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-66fd84fdb4-cwm4r" podStartSLOduration=22.671092894 podStartE2EDuration="28.523655539s" podCreationTimestamp="2025-02-13 19:53:12 +0000 UTC" firstStartedPulling="2025-02-13 19:53:34.313903135 +0000 UTC m=+42.876537529" lastFinishedPulling="2025-02-13 19:53:40.16646578 +0000 UTC m=+48.729100174" observedRunningTime="2025-02-13 19:53:40.523029784 +0000 UTC m=+49.085664193" watchObservedRunningTime="2025-02-13 19:53:40.523655539 +0000 UTC m=+49.086289941" Feb 13 19:53:40.637583 containerd[1536]: time="2025-02-13T19:53:40.637551239Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:40.637896 containerd[1536]: time="2025-02-13T19:53:40.637789512Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Feb 13 19:53:40.639201 containerd[1536]: time="2025-02-13T19:53:40.639181594Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 472.341719ms" Feb 13 19:53:40.639252 containerd[1536]: time="2025-02-13T19:53:40.639203061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Feb 13 19:53:40.640199 containerd[1536]: 
time="2025-02-13T19:53:40.639809936Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Feb 13 19:53:40.641350 containerd[1536]: time="2025-02-13T19:53:40.641331902Z" level=info msg="CreateContainer within sandbox \"6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Feb 13 19:53:40.662391 containerd[1536]: time="2025-02-13T19:53:40.662277268Z" level=info msg="CreateContainer within sandbox \"6fa75d23cee7411f880511e577dbce5259caa16a3ea386ffab2ac51d183b1314\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"511ca1d60e2f87b6a2c4320875b5e2a75b0b92e51f8af6ee4b728f8e4f26924f\"" Feb 13 19:53:40.663122 containerd[1536]: time="2025-02-13T19:53:40.663072061Z" level=info msg="StartContainer for \"511ca1d60e2f87b6a2c4320875b5e2a75b0b92e51f8af6ee4b728f8e4f26924f\"" Feb 13 19:53:40.688206 systemd[1]: Started cri-containerd-511ca1d60e2f87b6a2c4320875b5e2a75b0b92e51f8af6ee4b728f8e4f26924f.scope - libcontainer container 511ca1d60e2f87b6a2c4320875b5e2a75b0b92e51f8af6ee4b728f8e4f26924f. 
Feb 13 19:53:40.736011 containerd[1536]: time="2025-02-13T19:53:40.735969297Z" level=info msg="StartContainer for \"511ca1d60e2f87b6a2c4320875b5e2a75b0b92e51f8af6ee4b728f8e4f26924f\" returns successfully" Feb 13 19:53:41.522266 kubelet[2855]: I0213 19:53:41.522241 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:53:41.528916 kubelet[2855]: I0213 19:53:41.528033 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-66fd84fdb4-2lrbj" podStartSLOduration=23.313447639 podStartE2EDuration="29.528022141s" podCreationTimestamp="2025-02-13 19:53:12 +0000 UTC" firstStartedPulling="2025-02-13 19:53:34.425175213 +0000 UTC m=+42.987809607" lastFinishedPulling="2025-02-13 19:53:40.639749715 +0000 UTC m=+49.202384109" observedRunningTime="2025-02-13 19:53:41.527018129 +0000 UTC m=+50.089652538" watchObservedRunningTime="2025-02-13 19:53:41.528022141 +0000 UTC m=+50.090656536" Feb 13 19:53:42.493356 containerd[1536]: time="2025-02-13T19:53:42.493250713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:42.494425 containerd[1536]: time="2025-02-13T19:53:42.494381049Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Feb 13 19:53:42.495034 containerd[1536]: time="2025-02-13T19:53:42.495004427Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:42.496481 containerd[1536]: time="2025-02-13T19:53:42.496453906Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Feb 13 19:53:42.497125 
containerd[1536]: time="2025-02-13T19:53:42.496744467Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.8565864s" Feb 13 19:53:42.497125 containerd[1536]: time="2025-02-13T19:53:42.496766201Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Feb 13 19:53:42.498532 containerd[1536]: time="2025-02-13T19:53:42.498502311Z" level=info msg="CreateContainer within sandbox \"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Feb 13 19:53:42.516422 containerd[1536]: time="2025-02-13T19:53:42.516390938Z" level=info msg="CreateContainer within sandbox \"3f849bd61959caaadea9330a25e83eaca01fe762fa7e206180123ff9f9206311\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"20bef42ed119e2bf9c970b8567fcf138b7ec9394c18c07bec691d96b966a1a5e\"" Feb 13 19:53:42.516875 containerd[1536]: time="2025-02-13T19:53:42.516767388Z" level=info msg="StartContainer for \"20bef42ed119e2bf9c970b8567fcf138b7ec9394c18c07bec691d96b966a1a5e\"" Feb 13 19:53:42.530219 kubelet[2855]: I0213 19:53:42.530192 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:53:42.549134 systemd[1]: Started cri-containerd-20bef42ed119e2bf9c970b8567fcf138b7ec9394c18c07bec691d96b966a1a5e.scope - libcontainer container 20bef42ed119e2bf9c970b8567fcf138b7ec9394c18c07bec691d96b966a1a5e. 
Feb 13 19:53:42.574503 containerd[1536]: time="2025-02-13T19:53:42.574474057Z" level=info msg="StartContainer for \"20bef42ed119e2bf9c970b8567fcf138b7ec9394c18c07bec691d96b966a1a5e\" returns successfully" Feb 13 19:53:42.983707 kubelet[2855]: I0213 19:53:42.983565 2855 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Feb 13 19:53:42.988628 kubelet[2855]: I0213 19:53:42.988565 2855 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Feb 13 19:53:51.573758 containerd[1536]: time="2025-02-13T19:53:51.573558604Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:51.585856 containerd[1536]: time="2025-02-13T19:53:51.573625819Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:51.585996 containerd[1536]: time="2025-02-13T19:53:51.585938684Z" level=info msg="StopPodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:51.610286 containerd[1536]: time="2025-02-13T19:53:51.610210954Z" level=info msg="RemovePodSandbox for \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:51.618115 containerd[1536]: time="2025-02-13T19:53:51.616883295Z" level=info msg="Forcibly stopping sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\"" Feb 13 19:53:51.623057 containerd[1536]: time="2025-02-13T19:53:51.616950921Z" level=info msg="TearDown network for sandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" successfully" Feb 13 19:53:51.640904 containerd[1536]: time="2025-02-13T19:53:51.640879417Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID 
\"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." Feb 13 19:53:51.655460 containerd[1536]: time="2025-02-13T19:53:51.655440382Z" level=info msg="RemovePodSandbox \"113fab3a0519437e8fa2204404ae7ced8a5d362d9a2fea736f7b958a3dc4fe4a\" returns successfully" Feb 13 19:53:51.655769 containerd[1536]: time="2025-02-13T19:53:51.655640028Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:51.655769 containerd[1536]: time="2025-02-13T19:53:51.655685051Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:51.655769 containerd[1536]: time="2025-02-13T19:53:51.655691674Z" level=info msg="StopPodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:51.655962 containerd[1536]: time="2025-02-13T19:53:51.655909766Z" level=info msg="RemovePodSandbox for \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:51.655962 containerd[1536]: time="2025-02-13T19:53:51.655921677Z" level=info msg="Forcibly stopping sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\"" Feb 13 19:53:51.656424 containerd[1536]: time="2025-02-13T19:53:51.656219505Z" level=info msg="TearDown network for sandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" successfully" Feb 13 19:53:51.658797 containerd[1536]: time="2025-02-13T19:53:51.658639462Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.658797 containerd[1536]: time="2025-02-13T19:53:51.658660720Z" level=info msg="RemovePodSandbox \"cddc966d73fe937c9a677643b04338f11bd15529fb61377d1904e846201b657a\" returns successfully" Feb 13 19:53:51.659365 containerd[1536]: time="2025-02-13T19:53:51.659091260Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:51.659365 containerd[1536]: time="2025-02-13T19:53:51.659134615Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:51.659365 containerd[1536]: time="2025-02-13T19:53:51.659140582Z" level=info msg="StopPodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:51.660006 containerd[1536]: time="2025-02-13T19:53:51.659570657Z" level=info msg="RemovePodSandbox for \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:51.660006 containerd[1536]: time="2025-02-13T19:53:51.659583541Z" level=info msg="Forcibly stopping sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\"" Feb 13 19:53:51.660006 containerd[1536]: time="2025-02-13T19:53:51.659616502Z" level=info msg="TearDown network for sandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" successfully" Feb 13 19:53:51.660946 containerd[1536]: time="2025-02-13T19:53:51.660931031Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.660991 containerd[1536]: time="2025-02-13T19:53:51.660953328Z" level=info msg="RemovePodSandbox \"1dd9774ba27ba9581cca1de82af50f9820fa8bceb47766955c9bf99575ad1f9c\" returns successfully" Feb 13 19:53:51.661099 containerd[1536]: time="2025-02-13T19:53:51.661086507Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:51.661139 containerd[1536]: time="2025-02-13T19:53:51.661128916Z" level=info msg="TearDown network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" successfully" Feb 13 19:53:51.661341 containerd[1536]: time="2025-02-13T19:53:51.661141037Z" level=info msg="StopPodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" returns successfully" Feb 13 19:53:51.661341 containerd[1536]: time="2025-02-13T19:53:51.661285697Z" level=info msg="RemovePodSandbox for \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:51.661341 containerd[1536]: time="2025-02-13T19:53:51.661297388Z" level=info msg="Forcibly stopping sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\"" Feb 13 19:53:51.661341 containerd[1536]: time="2025-02-13T19:53:51.661326214Z" level=info msg="TearDown network for sandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" successfully" Feb 13 19:53:51.664767 containerd[1536]: time="2025-02-13T19:53:51.664752066Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.664800 containerd[1536]: time="2025-02-13T19:53:51.664777166Z" level=info msg="RemovePodSandbox \"de9e443f3a00bb2e9c80d0db47d0cf01121692213f2a42830e79327af7a00375\" returns successfully" Feb 13 19:53:51.664970 containerd[1536]: time="2025-02-13T19:53:51.664955197Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" Feb 13 19:53:51.665035 containerd[1536]: time="2025-02-13T19:53:51.665022364Z" level=info msg="TearDown network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" successfully" Feb 13 19:53:51.665035 containerd[1536]: time="2025-02-13T19:53:51.665034094Z" level=info msg="StopPodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" returns successfully" Feb 13 19:53:51.665437 containerd[1536]: time="2025-02-13T19:53:51.665260816Z" level=info msg="RemovePodSandbox for \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" Feb 13 19:53:51.669302 containerd[1536]: time="2025-02-13T19:53:51.669253038Z" level=info msg="Forcibly stopping sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\"" Feb 13 19:53:51.669351 containerd[1536]: time="2025-02-13T19:53:51.669292208Z" level=info msg="TearDown network for sandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" successfully" Feb 13 19:53:51.670372 containerd[1536]: time="2025-02-13T19:53:51.670357261Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.670471 containerd[1536]: time="2025-02-13T19:53:51.670378897Z" level=info msg="RemovePodSandbox \"e6376b10b5e4635d57b7d201b1b18b6d403be7e8d4dd6a47dd9583ea7200d0f3\" returns successfully" Feb 13 19:53:51.670674 containerd[1536]: time="2025-02-13T19:53:51.670539605Z" level=info msg="StopPodSandbox for \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\"" Feb 13 19:53:51.670674 containerd[1536]: time="2025-02-13T19:53:51.670580323Z" level=info msg="TearDown network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" successfully" Feb 13 19:53:51.670674 containerd[1536]: time="2025-02-13T19:53:51.670586434Z" level=info msg="StopPodSandbox for \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" returns successfully" Feb 13 19:53:51.670834 containerd[1536]: time="2025-02-13T19:53:51.670802062Z" level=info msg="RemovePodSandbox for \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\"" Feb 13 19:53:51.670834 containerd[1536]: time="2025-02-13T19:53:51.670813929Z" level=info msg="Forcibly stopping sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\"" Feb 13 19:53:51.671305 containerd[1536]: time="2025-02-13T19:53:51.670934977Z" level=info msg="TearDown network for sandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" successfully" Feb 13 19:53:51.672148 containerd[1536]: time="2025-02-13T19:53:51.672136163Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.672207 containerd[1536]: time="2025-02-13T19:53:51.672198372Z" level=info msg="RemovePodSandbox \"5dc8dee03b53fff56a188741949e1625950339be4527b3f36d3be7e7ef2f1fcc\" returns successfully" Feb 13 19:53:51.672366 containerd[1536]: time="2025-02-13T19:53:51.672356044Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:51.673536 containerd[1536]: time="2025-02-13T19:53:51.672551485Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:51.673536 containerd[1536]: time="2025-02-13T19:53:51.672558966Z" level=info msg="StopPodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:51.673536 containerd[1536]: time="2025-02-13T19:53:51.672730000Z" level=info msg="RemovePodSandbox for \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:51.673536 containerd[1536]: time="2025-02-13T19:53:51.672740021Z" level=info msg="Forcibly stopping sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\"" Feb 13 19:53:51.673536 containerd[1536]: time="2025-02-13T19:53:51.672769943Z" level=info msg="TearDown network for sandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" successfully" Feb 13 19:53:51.674156 containerd[1536]: time="2025-02-13T19:53:51.674103098Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.674156 containerd[1536]: time="2025-02-13T19:53:51.674123870Z" level=info msg="RemovePodSandbox \"ce98f763830a8e932fcc9eaa854521c7da2b463e23bc33e42e8d3b5f70fad9f5\" returns successfully" Feb 13 19:53:51.674342 containerd[1536]: time="2025-02-13T19:53:51.674304938Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:51.674545 containerd[1536]: time="2025-02-13T19:53:51.674444649Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:51.674545 containerd[1536]: time="2025-02-13T19:53:51.674454620Z" level=info msg="StopPodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:51.675033 containerd[1536]: time="2025-02-13T19:53:51.674641051Z" level=info msg="RemovePodSandbox for \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:51.675033 containerd[1536]: time="2025-02-13T19:53:51.674653068Z" level=info msg="Forcibly stopping sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\"" Feb 13 19:53:51.675033 containerd[1536]: time="2025-02-13T19:53:51.674685153Z" level=info msg="TearDown network for sandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" successfully" Feb 13 19:53:51.675851 containerd[1536]: time="2025-02-13T19:53:51.675839058Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.675907 containerd[1536]: time="2025-02-13T19:53:51.675898865Z" level=info msg="RemovePodSandbox \"787855739ad49cd056737f7de5e39254a03b9e1ea901b77c6c5f57cc36a18621\" returns successfully" Feb 13 19:53:51.676091 containerd[1536]: time="2025-02-13T19:53:51.676077330Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:51.676132 containerd[1536]: time="2025-02-13T19:53:51.676119137Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:51.676132 containerd[1536]: time="2025-02-13T19:53:51.676125406Z" level=info msg="StopPodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:51.676297 containerd[1536]: time="2025-02-13T19:53:51.676268747Z" level=info msg="RemovePodSandbox for \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:51.676338 containerd[1536]: time="2025-02-13T19:53:51.676300581Z" level=info msg="Forcibly stopping sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\"" Feb 13 19:53:51.676363 containerd[1536]: time="2025-02-13T19:53:51.676333128Z" level=info msg="TearDown network for sandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" successfully" Feb 13 19:53:51.677377 containerd[1536]: time="2025-02-13T19:53:51.677362861Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.677407 containerd[1536]: time="2025-02-13T19:53:51.677384231Z" level=info msg="RemovePodSandbox \"3ed4ef772c93616247d80a816ca9913fd024a3896a23dba3f062f710f9f14166\" returns successfully" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677596897Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677643895Z" level=info msg="TearDown network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" successfully" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677653445Z" level=info msg="StopPodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" returns successfully" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677765065Z" level=info msg="RemovePodSandbox for \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677777697Z" level=info msg="Forcibly stopping sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\"" Feb 13 19:53:51.678591 containerd[1536]: time="2025-02-13T19:53:51.677808901Z" level=info msg="TearDown network for sandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" successfully" Feb 13 19:53:51.679375 containerd[1536]: time="2025-02-13T19:53:51.679126218Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.679375 containerd[1536]: time="2025-02-13T19:53:51.679146579Z" level=info msg="RemovePodSandbox \"3a5fe20f3374406c7abc245cc76b7d189630ede0737ef83cfd1a629234adbddf\" returns successfully" Feb 13 19:53:51.679375 containerd[1536]: time="2025-02-13T19:53:51.679261842Z" level=info msg="StopPodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" Feb 13 19:53:51.679375 containerd[1536]: time="2025-02-13T19:53:51.679296875Z" level=info msg="TearDown network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" successfully" Feb 13 19:53:51.679375 containerd[1536]: time="2025-02-13T19:53:51.679302286Z" level=info msg="StopPodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" returns successfully" Feb 13 19:53:51.679612 containerd[1536]: time="2025-02-13T19:53:51.679601787Z" level=info msg="RemovePodSandbox for \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" Feb 13 19:53:51.679702 containerd[1536]: time="2025-02-13T19:53:51.679685847Z" level=info msg="Forcibly stopping sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\"" Feb 13 19:53:51.679792 containerd[1536]: time="2025-02-13T19:53:51.679773773Z" level=info msg="TearDown network for sandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" successfully" Feb 13 19:53:51.680969 containerd[1536]: time="2025-02-13T19:53:51.680854648Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.680969 containerd[1536]: time="2025-02-13T19:53:51.680875609Z" level=info msg="RemovePodSandbox \"4cf47867adde05ea7af70d69861a88734926006ea9dfc8a4cc38b424ba4af5d5\" returns successfully" Feb 13 19:53:51.681045 containerd[1536]: time="2025-02-13T19:53:51.681004573Z" level=info msg="StopPodSandbox for \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\"" Feb 13 19:53:51.681045 containerd[1536]: time="2025-02-13T19:53:51.681039321Z" level=info msg="TearDown network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" successfully" Feb 13 19:53:51.681045 containerd[1536]: time="2025-02-13T19:53:51.681044498Z" level=info msg="StopPodSandbox for \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" returns successfully" Feb 13 19:53:51.681259 containerd[1536]: time="2025-02-13T19:53:51.681245063Z" level=info msg="RemovePodSandbox for \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\"" Feb 13 19:53:51.681259 containerd[1536]: time="2025-02-13T19:53:51.681258030Z" level=info msg="Forcibly stopping sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\"" Feb 13 19:53:51.681320 containerd[1536]: time="2025-02-13T19:53:51.681284446Z" level=info msg="TearDown network for sandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" successfully" Feb 13 19:53:51.682318 containerd[1536]: time="2025-02-13T19:53:51.682301875Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.682695 containerd[1536]: time="2025-02-13T19:53:51.682322470Z" level=info msg="RemovePodSandbox \"261c1a835ba009dd3d641bd26e5e0684fd0e25bd50ff9b037c2cc295f0c2f86a\" returns successfully" Feb 13 19:53:51.682695 containerd[1536]: time="2025-02-13T19:53:51.682546929Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:51.682695 containerd[1536]: time="2025-02-13T19:53:51.682645963Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:51.682851 containerd[1536]: time="2025-02-13T19:53:51.682653669Z" level=info msg="StopPodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:51.682903 containerd[1536]: time="2025-02-13T19:53:51.682886253Z" level=info msg="RemovePodSandbox for \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:51.682930 containerd[1536]: time="2025-02-13T19:53:51.682900955Z" level=info msg="Forcibly stopping sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\"" Feb 13 19:53:51.682947 containerd[1536]: time="2025-02-13T19:53:51.682932845Z" level=info msg="TearDown network for sandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" successfully" Feb 13 19:53:51.683956 containerd[1536]: time="2025-02-13T19:53:51.683940688Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.684106 containerd[1536]: time="2025-02-13T19:53:51.683961496Z" level=info msg="RemovePodSandbox \"d1252dcb5dfdfc77be3bb7af75faa5122b7609fcc7fc91bf395cee6ba5215772\" returns successfully" Feb 13 19:53:51.684152 containerd[1536]: time="2025-02-13T19:53:51.684134150Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:51.684243 containerd[1536]: time="2025-02-13T19:53:51.684173004Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:51.684243 containerd[1536]: time="2025-02-13T19:53:51.684180510Z" level=info msg="StopPodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:51.684297 containerd[1536]: time="2025-02-13T19:53:51.684283526Z" level=info msg="RemovePodSandbox for \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:51.684297 containerd[1536]: time="2025-02-13T19:53:51.684293150Z" level=info msg="Forcibly stopping sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\"" Feb 13 19:53:51.684368 containerd[1536]: time="2025-02-13T19:53:51.684335858Z" level=info msg="TearDown network for sandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" successfully" Feb 13 19:53:51.685371 containerd[1536]: time="2025-02-13T19:53:51.685356565Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.685413 containerd[1536]: time="2025-02-13T19:53:51.685377914Z" level=info msg="RemovePodSandbox \"3189c94e8d11ca268f2d52da17135e4d2f914a7d1cb1d77e083d0dab7acb2450\" returns successfully" Feb 13 19:53:51.685690 containerd[1536]: time="2025-02-13T19:53:51.685590139Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:51.685690 containerd[1536]: time="2025-02-13T19:53:51.685646234Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:51.685690 containerd[1536]: time="2025-02-13T19:53:51.685654091Z" level=info msg="StopPodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:51.686007 containerd[1536]: time="2025-02-13T19:53:51.685855625Z" level=info msg="RemovePodSandbox for \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:51.686007 containerd[1536]: time="2025-02-13T19:53:51.685867630Z" level=info msg="Forcibly stopping sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\"" Feb 13 19:53:51.686007 containerd[1536]: time="2025-02-13T19:53:51.685943647Z" level=info msg="TearDown network for sandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" successfully" Feb 13 19:53:51.687170 containerd[1536]: time="2025-02-13T19:53:51.687115218Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.687170 containerd[1536]: time="2025-02-13T19:53:51.687136386Z" level=info msg="RemovePodSandbox \"d10647b0f3da684e3a617c035d3ef60ec493152eb3ba34a1986842da8a147ce6\" returns successfully" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687374469Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687416795Z" level=info msg="TearDown network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" successfully" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687423173Z" level=info msg="StopPodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" returns successfully" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687550063Z" level=info msg="RemovePodSandbox for \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687559758Z" level=info msg="Forcibly stopping sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\"" Feb 13 19:53:51.688243 containerd[1536]: time="2025-02-13T19:53:51.687587914Z" level=info msg="TearDown network for sandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" successfully" Feb 13 19:53:51.688823 containerd[1536]: time="2025-02-13T19:53:51.688808120Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.688853 containerd[1536]: time="2025-02-13T19:53:51.688830728Z" level=info msg="RemovePodSandbox \"546a78de7df6517918fa8cdadb3acc3350d9e97acd84445f52e290102d23ea7c\" returns successfully" Feb 13 19:53:51.689050 containerd[1536]: time="2025-02-13T19:53:51.689015064Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" Feb 13 19:53:51.689191 containerd[1536]: time="2025-02-13T19:53:51.689106467Z" level=info msg="TearDown network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" successfully" Feb 13 19:53:51.689191 containerd[1536]: time="2025-02-13T19:53:51.689114859Z" level=info msg="StopPodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" returns successfully" Feb 13 19:53:51.690078 containerd[1536]: time="2025-02-13T19:53:51.689315509Z" level=info msg="RemovePodSandbox for \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" Feb 13 19:53:51.690078 containerd[1536]: time="2025-02-13T19:53:51.689327998Z" level=info msg="Forcibly stopping sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\"" Feb 13 19:53:51.690078 containerd[1536]: time="2025-02-13T19:53:51.689359472Z" level=info msg="TearDown network for sandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" successfully" Feb 13 19:53:51.690488 containerd[1536]: time="2025-02-13T19:53:51.690476357Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.690567 containerd[1536]: time="2025-02-13T19:53:51.690557645Z" level=info msg="RemovePodSandbox \"32cc08989513c5c464d4f2015dec0178d40b8a1836f9ff787da9668bf19af4a3\" returns successfully" Feb 13 19:53:51.690760 containerd[1536]: time="2025-02-13T19:53:51.690750675Z" level=info msg="StopPodSandbox for \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\"" Feb 13 19:53:51.690868 containerd[1536]: time="2025-02-13T19:53:51.690859572Z" level=info msg="TearDown network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" successfully" Feb 13 19:53:51.690930 containerd[1536]: time="2025-02-13T19:53:51.690922018Z" level=info msg="StopPodSandbox for \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" returns successfully" Feb 13 19:53:51.691113 containerd[1536]: time="2025-02-13T19:53:51.691103209Z" level=info msg="RemovePodSandbox for \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\"" Feb 13 19:53:51.691203 containerd[1536]: time="2025-02-13T19:53:51.691194718Z" level=info msg="Forcibly stopping sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\"" Feb 13 19:53:51.691327 containerd[1536]: time="2025-02-13T19:53:51.691299639Z" level=info msg="TearDown network for sandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" successfully" Feb 13 19:53:51.692371 containerd[1536]: time="2025-02-13T19:53:51.692357836Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.692442 containerd[1536]: time="2025-02-13T19:53:51.692433472Z" level=info msg="RemovePodSandbox \"3b95cfde0d60fa586ff5f53f63013fe0ba1c2b0b7e84f59d1c4456aff3ced021\" returns successfully" Feb 13 19:53:51.692602 containerd[1536]: time="2025-02-13T19:53:51.692584620Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:51.692892 containerd[1536]: time="2025-02-13T19:53:51.692624778Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:51.692892 containerd[1536]: time="2025-02-13T19:53:51.692631161Z" level=info msg="StopPodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:51.692892 containerd[1536]: time="2025-02-13T19:53:51.692798186Z" level=info msg="RemovePodSandbox for \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:51.692892 containerd[1536]: time="2025-02-13T19:53:51.692809061Z" level=info msg="Forcibly stopping sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\"" Feb 13 19:53:51.693504 containerd[1536]: time="2025-02-13T19:53:51.693023667Z" level=info msg="TearDown network for sandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" successfully" Feb 13 19:53:51.694173 containerd[1536]: time="2025-02-13T19:53:51.694160909Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.694228 containerd[1536]: time="2025-02-13T19:53:51.694219261Z" level=info msg="RemovePodSandbox \"a291e52e25b37749384ee5514ea01c8c0cfcc6c2288822986df3f0918ba0b8e2\" returns successfully" Feb 13 19:53:51.694396 containerd[1536]: time="2025-02-13T19:53:51.694382780Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:51.694436 containerd[1536]: time="2025-02-13T19:53:51.694420666Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:51.694436 containerd[1536]: time="2025-02-13T19:53:51.694428569Z" level=info msg="StopPodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:51.694639 containerd[1536]: time="2025-02-13T19:53:51.694619535Z" level=info msg="RemovePodSandbox for \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:51.694725 containerd[1536]: time="2025-02-13T19:53:51.694717096Z" level=info msg="Forcibly stopping sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\"" Feb 13 19:53:51.694814 containerd[1536]: time="2025-02-13T19:53:51.694796133Z" level=info msg="TearDown network for sandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" successfully" Feb 13 19:53:51.695943 containerd[1536]: time="2025-02-13T19:53:51.695931186Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.696037 containerd[1536]: time="2025-02-13T19:53:51.696026919Z" level=info msg="RemovePodSandbox \"c53e84917aa2aeb6b842290d063d9f194481d8cca2dc4a92f0c83255d8bb2d0f\" returns successfully" Feb 13 19:53:51.696202 containerd[1536]: time="2025-02-13T19:53:51.696191048Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:51.696247 containerd[1536]: time="2025-02-13T19:53:51.696228043Z" level=info msg="TearDown network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:51.696247 containerd[1536]: time="2025-02-13T19:53:51.696233848Z" level=info msg="StopPodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:51.696519 containerd[1536]: time="2025-02-13T19:53:51.696361147Z" level=info msg="RemovePodSandbox for \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:51.696519 containerd[1536]: time="2025-02-13T19:53:51.696372060Z" level=info msg="Forcibly stopping sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\"" Feb 13 19:53:51.697076 containerd[1536]: time="2025-02-13T19:53:51.696458700Z" level=info msg="TearDown network for sandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" successfully" Feb 13 19:53:51.697786 containerd[1536]: time="2025-02-13T19:53:51.697759980Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.697841 containerd[1536]: time="2025-02-13T19:53:51.697832484Z" level=info msg="RemovePodSandbox \"066f2d9588d3a850c7568cc55f66e353f1ea0c9761e5ef1eca466d8ddec79c95\" returns successfully" Feb 13 19:53:51.698009 containerd[1536]: time="2025-02-13T19:53:51.697975718Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:51.698046 containerd[1536]: time="2025-02-13T19:53:51.698026978Z" level=info msg="TearDown network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" successfully" Feb 13 19:53:51.698046 containerd[1536]: time="2025-02-13T19:53:51.698033866Z" level=info msg="StopPodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" returns successfully" Feb 13 19:53:51.698206 containerd[1536]: time="2025-02-13T19:53:51.698192042Z" level=info msg="RemovePodSandbox for \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:51.698240 containerd[1536]: time="2025-02-13T19:53:51.698206429Z" level=info msg="Forcibly stopping sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\"" Feb 13 19:53:51.698274 containerd[1536]: time="2025-02-13T19:53:51.698234300Z" level=info msg="TearDown network for sandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" successfully" Feb 13 19:53:51.699254 containerd[1536]: time="2025-02-13T19:53:51.699239096Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.699280 containerd[1536]: time="2025-02-13T19:53:51.699263647Z" level=info msg="RemovePodSandbox \"b09a10ce3d181cfded2536646f0584fa1e2300d7e98b1aba9f901123dbed60c8\" returns successfully" Feb 13 19:53:51.699426 containerd[1536]: time="2025-02-13T19:53:51.699412672Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" Feb 13 19:53:51.699497 containerd[1536]: time="2025-02-13T19:53:51.699454802Z" level=info msg="TearDown network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" successfully" Feb 13 19:53:51.699497 containerd[1536]: time="2025-02-13T19:53:51.699461240Z" level=info msg="StopPodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" returns successfully" Feb 13 19:53:51.699889 containerd[1536]: time="2025-02-13T19:53:51.699641894Z" level=info msg="RemovePodSandbox for \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" Feb 13 19:53:51.699889 containerd[1536]: time="2025-02-13T19:53:51.699654623Z" level=info msg="Forcibly stopping sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\"" Feb 13 19:53:51.699889 containerd[1536]: time="2025-02-13T19:53:51.699686897Z" level=info msg="TearDown network for sandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" successfully" Feb 13 19:53:51.700892 containerd[1536]: time="2025-02-13T19:53:51.700880344Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.701133 containerd[1536]: time="2025-02-13T19:53:51.701009426Z" level=info msg="RemovePodSandbox \"63569d9a692746652e07bd77a073a09c7d91b9d8e6fafd511be2ef0b0bd62929\" returns successfully" Feb 13 19:53:51.701172 containerd[1536]: time="2025-02-13T19:53:51.701143886Z" level=info msg="StopPodSandbox for \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\"" Feb 13 19:53:51.701191 containerd[1536]: time="2025-02-13T19:53:51.701180046Z" level=info msg="TearDown network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" successfully" Feb 13 19:53:51.701191 containerd[1536]: time="2025-02-13T19:53:51.701185601Z" level=info msg="StopPodSandbox for \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" returns successfully" Feb 13 19:53:51.701380 containerd[1536]: time="2025-02-13T19:53:51.701302796Z" level=info msg="RemovePodSandbox for \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\"" Feb 13 19:53:51.701380 containerd[1536]: time="2025-02-13T19:53:51.701312398Z" level=info msg="Forcibly stopping sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\"" Feb 13 19:53:51.701380 containerd[1536]: time="2025-02-13T19:53:51.701338270Z" level=info msg="TearDown network for sandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" successfully" Feb 13 19:53:51.702377 containerd[1536]: time="2025-02-13T19:53:51.702361623Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.702433 containerd[1536]: time="2025-02-13T19:53:51.702383911Z" level=info msg="RemovePodSandbox \"47d012389fda74d33c8fe476fcb364d4f885d05a593c736d7c4ce85ea69aa6cb\" returns successfully" Feb 13 19:53:51.702634 containerd[1536]: time="2025-02-13T19:53:51.702596670Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:51.702747 containerd[1536]: time="2025-02-13T19:53:51.702688410Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:51.702747 containerd[1536]: time="2025-02-13T19:53:51.702696043Z" level=info msg="StopPodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:51.702910 containerd[1536]: time="2025-02-13T19:53:51.702893442Z" level=info msg="RemovePodSandbox for \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:51.702910 containerd[1536]: time="2025-02-13T19:53:51.702907884Z" level=info msg="Forcibly stopping sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\"" Feb 13 19:53:51.702962 containerd[1536]: time="2025-02-13T19:53:51.702943212Z" level=info msg="TearDown network for sandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" successfully" Feb 13 19:53:51.704592 containerd[1536]: time="2025-02-13T19:53:51.704577099Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.704636 containerd[1536]: time="2025-02-13T19:53:51.704611034Z" level=info msg="RemovePodSandbox \"2d5d438ae06709b7d916c863cf0e48b276f7ff2b0b13f76bd0573fe0959b283a\" returns successfully" Feb 13 19:53:51.704762 containerd[1536]: time="2025-02-13T19:53:51.704741708Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:51.704809 containerd[1536]: time="2025-02-13T19:53:51.704787087Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:51.704809 containerd[1536]: time="2025-02-13T19:53:51.704797009Z" level=info msg="StopPodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:51.705031 containerd[1536]: time="2025-02-13T19:53:51.704978336Z" level=info msg="RemovePodSandbox for \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:51.705031 containerd[1536]: time="2025-02-13T19:53:51.705008407Z" level=info msg="Forcibly stopping sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\"" Feb 13 19:53:51.705096 containerd[1536]: time="2025-02-13T19:53:51.705060128Z" level=info msg="TearDown network for sandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" successfully" Feb 13 19:53:51.706106 containerd[1536]: time="2025-02-13T19:53:51.706091733Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.706220 containerd[1536]: time="2025-02-13T19:53:51.706114513Z" level=info msg="RemovePodSandbox \"b90a5569d38de8fefccc65914779c2848bdf017b3bcb504317fcc422e60663a0\" returns successfully" Feb 13 19:53:51.706249 containerd[1536]: time="2025-02-13T19:53:51.706237486Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:51.706351 containerd[1536]: time="2025-02-13T19:53:51.706275636Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 19:53:51.706351 containerd[1536]: time="2025-02-13T19:53:51.706283855Z" level=info msg="StopPodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:51.706415 containerd[1536]: time="2025-02-13T19:53:51.706396385Z" level=info msg="RemovePodSandbox for \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:51.706415 containerd[1536]: time="2025-02-13T19:53:51.706409139Z" level=info msg="Forcibly stopping sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\"" Feb 13 19:53:51.706502 containerd[1536]: time="2025-02-13T19:53:51.706468406Z" level=info msg="TearDown network for sandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" successfully" Feb 13 19:53:51.707625 containerd[1536]: time="2025-02-13T19:53:51.707610952Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.707718 containerd[1536]: time="2025-02-13T19:53:51.707633626Z" level=info msg="RemovePodSandbox \"3e74f1dd8df79ac46ee39733279b0377247ea1c08483a4c9ea0057b7e13c248c\" returns successfully" Feb 13 19:53:51.707916 containerd[1536]: time="2025-02-13T19:53:51.707787636Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:51.707916 containerd[1536]: time="2025-02-13T19:53:51.707828444Z" level=info msg="TearDown network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" successfully" Feb 13 19:53:51.707916 containerd[1536]: time="2025-02-13T19:53:51.707834720Z" level=info msg="StopPodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" returns successfully" Feb 13 19:53:51.708717 containerd[1536]: time="2025-02-13T19:53:51.708224444Z" level=info msg="RemovePodSandbox for \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:51.708717 containerd[1536]: time="2025-02-13T19:53:51.708242362Z" level=info msg="Forcibly stopping sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\"" Feb 13 19:53:51.708717 containerd[1536]: time="2025-02-13T19:53:51.708299008Z" level=info msg="TearDown network for sandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" successfully" Feb 13 19:53:51.709668 containerd[1536]: time="2025-02-13T19:53:51.709654516Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.709738 containerd[1536]: time="2025-02-13T19:53:51.709726216Z" level=info msg="RemovePodSandbox \"02a805b82106e62be688d62258b99253eef4860a2282275088b0eed99fa81573\" returns successfully" Feb 13 19:53:51.709926 containerd[1536]: time="2025-02-13T19:53:51.709916436Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" Feb 13 19:53:51.710037 containerd[1536]: time="2025-02-13T19:53:51.710025459Z" level=info msg="TearDown network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" successfully" Feb 13 19:53:51.710235 containerd[1536]: time="2025-02-13T19:53:51.710130203Z" level=info msg="StopPodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" returns successfully" Feb 13 19:53:51.711278 containerd[1536]: time="2025-02-13T19:53:51.710315807Z" level=info msg="RemovePodSandbox for \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" Feb 13 19:53:51.711278 containerd[1536]: time="2025-02-13T19:53:51.710333880Z" level=info msg="Forcibly stopping sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\"" Feb 13 19:53:51.711278 containerd[1536]: time="2025-02-13T19:53:51.710371185Z" level=info msg="TearDown network for sandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" successfully" Feb 13 19:53:51.711710 containerd[1536]: time="2025-02-13T19:53:51.711697860Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.711787 containerd[1536]: time="2025-02-13T19:53:51.711776030Z" level=info msg="RemovePodSandbox \"31db4929c1e333135730e35a0af0d2b7197d9e90aed8159b0d0e91c7ef189cbb\" returns successfully" Feb 13 19:53:51.711941 containerd[1536]: time="2025-02-13T19:53:51.711931455Z" level=info msg="StopPodSandbox for \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\"" Feb 13 19:53:51.712056 containerd[1536]: time="2025-02-13T19:53:51.712035116Z" level=info msg="TearDown network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" successfully" Feb 13 19:53:51.712056 containerd[1536]: time="2025-02-13T19:53:51.712047224Z" level=info msg="StopPodSandbox for \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" returns successfully" Feb 13 19:53:51.712175 containerd[1536]: time="2025-02-13T19:53:51.712161590Z" level=info msg="RemovePodSandbox for \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\"" Feb 13 19:53:51.712175 containerd[1536]: time="2025-02-13T19:53:51.712171970Z" level=info msg="Forcibly stopping sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\"" Feb 13 19:53:51.712289 containerd[1536]: time="2025-02-13T19:53:51.712205068Z" level=info msg="TearDown network for sandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" successfully" Feb 13 19:53:51.713281 containerd[1536]: time="2025-02-13T19:53:51.713264464Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.713322 containerd[1536]: time="2025-02-13T19:53:51.713289016Z" level=info msg="RemovePodSandbox \"624cb779a76084117b8e3dc99a20013d2c2371e45f9f9e2bf6f24304a89f14e6\" returns successfully" Feb 13 19:53:51.713597 containerd[1536]: time="2025-02-13T19:53:51.713478788Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:51.713597 containerd[1536]: time="2025-02-13T19:53:51.713539063Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:51.713597 containerd[1536]: time="2025-02-13T19:53:51.713547256Z" level=info msg="StopPodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:51.713791 containerd[1536]: time="2025-02-13T19:53:51.713776135Z" level=info msg="RemovePodSandbox for \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:51.713791 containerd[1536]: time="2025-02-13T19:53:51.713790838Z" level=info msg="Forcibly stopping sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\"" Feb 13 19:53:51.713841 containerd[1536]: time="2025-02-13T19:53:51.713821704Z" level=info msg="TearDown network for sandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" successfully" Feb 13 19:53:51.715016 containerd[1536]: time="2025-02-13T19:53:51.714994490Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.715065 containerd[1536]: time="2025-02-13T19:53:51.715034378Z" level=info msg="RemovePodSandbox \"3960bd17c8f4d4aeb40a0972f2c3080a059ec850bec406ad0a124977d2d32811\" returns successfully" Feb 13 19:53:51.715361 containerd[1536]: time="2025-02-13T19:53:51.715229800Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:51.715361 containerd[1536]: time="2025-02-13T19:53:51.715276450Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:51.715361 containerd[1536]: time="2025-02-13T19:53:51.715283439Z" level=info msg="StopPodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:51.715674 containerd[1536]: time="2025-02-13T19:53:51.715584030Z" level=info msg="RemovePodSandbox for \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:51.715674 containerd[1536]: time="2025-02-13T19:53:51.715596629Z" level=info msg="Forcibly stopping sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\"" Feb 13 19:53:51.715674 containerd[1536]: time="2025-02-13T19:53:51.715634965Z" level=info msg="TearDown network for sandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" successfully" Feb 13 19:53:51.718001 containerd[1536]: time="2025-02-13T19:53:51.717975882Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.718931 containerd[1536]: time="2025-02-13T19:53:51.718069547Z" level=info msg="RemovePodSandbox \"9395736a8ce739dbd1046e5e73f44355f3dfad726aae6446780a94f418c1433f\" returns successfully" Feb 13 19:53:51.719486 containerd[1536]: time="2025-02-13T19:53:51.719474974Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:51.719581 containerd[1536]: time="2025-02-13T19:53:51.719571812Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:51.719809 containerd[1536]: time="2025-02-13T19:53:51.719722333Z" level=info msg="StopPodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:51.721707 containerd[1536]: time="2025-02-13T19:53:51.721695457Z" level=info msg="RemovePodSandbox for \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:51.721757 containerd[1536]: time="2025-02-13T19:53:51.721749828Z" level=info msg="Forcibly stopping sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\"" Feb 13 19:53:51.721832 containerd[1536]: time="2025-02-13T19:53:51.721813937Z" level=info msg="TearDown network for sandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" successfully" Feb 13 19:53:51.722975 containerd[1536]: time="2025-02-13T19:53:51.722963174Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.723073 containerd[1536]: time="2025-02-13T19:53:51.723063164Z" level=info msg="RemovePodSandbox \"36404b85237af91279e9774b1c1ee9e20f1e8be8f751924614dedab023e4f2ec\" returns successfully" Feb 13 19:53:51.723291 containerd[1536]: time="2025-02-13T19:53:51.723270229Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:51.723322 containerd[1536]: time="2025-02-13T19:53:51.723314212Z" level=info msg="TearDown network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" successfully" Feb 13 19:53:51.723340 containerd[1536]: time="2025-02-13T19:53:51.723321075Z" level=info msg="StopPodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" returns successfully" Feb 13 19:53:51.723486 containerd[1536]: time="2025-02-13T19:53:51.723455844Z" level=info msg="RemovePodSandbox for \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:51.723486 containerd[1536]: time="2025-02-13T19:53:51.723466271Z" level=info msg="Forcibly stopping sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\"" Feb 13 19:53:51.723533 containerd[1536]: time="2025-02-13T19:53:51.723501795Z" level=info msg="TearDown network for sandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" successfully" Feb 13 19:53:51.724789 containerd[1536]: time="2025-02-13T19:53:51.724774574Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.724832 containerd[1536]: time="2025-02-13T19:53:51.724795221Z" level=info msg="RemovePodSandbox \"ef2762072c6faca1e63ccd48076910e85f69e884c0a21b4c503bc48959f7dd03\" returns successfully" Feb 13 19:53:51.725143 containerd[1536]: time="2025-02-13T19:53:51.724989749Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" Feb 13 19:53:51.725143 containerd[1536]: time="2025-02-13T19:53:51.725101232Z" level=info msg="TearDown network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" successfully" Feb 13 19:53:51.725143 containerd[1536]: time="2025-02-13T19:53:51.725109040Z" level=info msg="StopPodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" returns successfully" Feb 13 19:53:51.725328 containerd[1536]: time="2025-02-13T19:53:51.725318963Z" level=info msg="RemovePodSandbox for \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" Feb 13 19:53:51.725725 containerd[1536]: time="2025-02-13T19:53:51.725394992Z" level=info msg="Forcibly stopping sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\"" Feb 13 19:53:51.725725 containerd[1536]: time="2025-02-13T19:53:51.725426632Z" level=info msg="TearDown network for sandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" successfully" Feb 13 19:53:51.726617 containerd[1536]: time="2025-02-13T19:53:51.726605538Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.726751 containerd[1536]: time="2025-02-13T19:53:51.726725780Z" level=info msg="RemovePodSandbox \"15c15966511cb600a6c8954e055de241446c06aefd1a1da1210849bd6dd25895\" returns successfully" Feb 13 19:53:51.726953 containerd[1536]: time="2025-02-13T19:53:51.726943262Z" level=info msg="StopPodSandbox for \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\"" Feb 13 19:53:51.727084 containerd[1536]: time="2025-02-13T19:53:51.727074387Z" level=info msg="TearDown network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" successfully" Feb 13 19:53:51.727257 containerd[1536]: time="2025-02-13T19:53:51.727159108Z" level=info msg="StopPodSandbox for \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" returns successfully" Feb 13 19:53:51.728106 containerd[1536]: time="2025-02-13T19:53:51.727323157Z" level=info msg="RemovePodSandbox for \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\"" Feb 13 19:53:51.728106 containerd[1536]: time="2025-02-13T19:53:51.727339058Z" level=info msg="Forcibly stopping sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\"" Feb 13 19:53:51.728106 containerd[1536]: time="2025-02-13T19:53:51.727373609Z" level=info msg="TearDown network for sandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" successfully" Feb 13 19:53:51.728583 containerd[1536]: time="2025-02-13T19:53:51.728571107Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Feb 13 19:53:51.728677 containerd[1536]: time="2025-02-13T19:53:51.728667336Z" level=info msg="RemovePodSandbox \"7f34696bd509853306d333bbf54aa756f7dcba2107a935bc549e7eb782c14996\" returns successfully" Feb 13 19:54:04.899632 kubelet[2855]: I0213 19:54:04.899559 2855 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-kzqv2" podStartSLOduration=43.520103441 podStartE2EDuration="51.899219349s" podCreationTimestamp="2025-02-13 19:53:13 +0000 UTC" firstStartedPulling="2025-02-13 19:53:34.118480974 +0000 UTC m=+42.681115368" lastFinishedPulling="2025-02-13 19:53:42.497596882 +0000 UTC m=+51.060231276" observedRunningTime="2025-02-13 19:53:43.542213168 +0000 UTC m=+52.104847572" watchObservedRunningTime="2025-02-13 19:54:04.899219349 +0000 UTC m=+73.461853745" Feb 13 19:54:07.986194 systemd[1]: Started sshd@7-139.178.70.104:22-147.75.109.163:44794.service - OpenSSH per-connection server daemon (147.75.109.163:44794). Feb 13 19:54:08.428443 sshd[5920]: Accepted publickey for core from 147.75.109.163 port 44794 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:08.431170 sshd-session[5920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:08.434657 systemd-logind[1517]: New session 10 of user core. Feb 13 19:54:08.439081 systemd[1]: Started session-10.scope - Session 10 of User core. Feb 13 19:54:08.943934 sshd[5924]: Connection closed by 147.75.109.163 port 44794 Feb 13 19:54:08.945152 sshd-session[5920]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:08.947607 systemd[1]: sshd@7-139.178.70.104:22-147.75.109.163:44794.service: Deactivated successfully. Feb 13 19:54:08.949391 systemd[1]: session-10.scope: Deactivated successfully. Feb 13 19:54:08.951377 systemd-logind[1517]: Session 10 logged out. Waiting for processes to exit. Feb 13 19:54:08.952820 systemd-logind[1517]: Removed session 10. 
Feb 13 19:54:13.952691 systemd[1]: Started sshd@8-139.178.70.104:22-147.75.109.163:52128.service - OpenSSH per-connection server daemon (147.75.109.163:52128). Feb 13 19:54:14.270478 sshd[5955]: Accepted publickey for core from 147.75.109.163 port 52128 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:14.271568 sshd-session[5955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:14.275794 systemd-logind[1517]: New session 11 of user core. Feb 13 19:54:14.288158 systemd[1]: Started session-11.scope - Session 11 of User core. Feb 13 19:54:14.444644 sshd[5957]: Connection closed by 147.75.109.163 port 52128 Feb 13 19:54:14.445095 sshd-session[5955]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:14.447597 systemd[1]: sshd@8-139.178.70.104:22-147.75.109.163:52128.service: Deactivated successfully. Feb 13 19:54:14.448802 systemd[1]: session-11.scope: Deactivated successfully. Feb 13 19:54:14.449269 systemd-logind[1517]: Session 11 logged out. Waiting for processes to exit. Feb 13 19:54:14.450254 systemd-logind[1517]: Removed session 11. Feb 13 19:54:16.011710 kubelet[2855]: I0213 19:54:16.002580 2855 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 13 19:54:19.454592 systemd[1]: Started sshd@9-139.178.70.104:22-147.75.109.163:52258.service - OpenSSH per-connection server daemon (147.75.109.163:52258). Feb 13 19:54:19.676747 sshd[5985]: Accepted publickey for core from 147.75.109.163 port 52258 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:19.677837 sshd-session[5985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:19.681341 systemd-logind[1517]: New session 12 of user core. Feb 13 19:54:19.686089 systemd[1]: Started session-12.scope - Session 12 of User core. 
Feb 13 19:54:19.796193 sshd[5987]: Connection closed by 147.75.109.163 port 52258 Feb 13 19:54:19.795895 sshd-session[5985]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:19.803737 systemd[1]: sshd@9-139.178.70.104:22-147.75.109.163:52258.service: Deactivated successfully. Feb 13 19:54:19.804885 systemd[1]: session-12.scope: Deactivated successfully. Feb 13 19:54:19.805347 systemd-logind[1517]: Session 12 logged out. Waiting for processes to exit. Feb 13 19:54:19.813795 systemd[1]: Started sshd@10-139.178.70.104:22-147.75.109.163:52274.service - OpenSSH per-connection server daemon (147.75.109.163:52274). Feb 13 19:54:19.815140 systemd-logind[1517]: Removed session 12. Feb 13 19:54:19.849927 sshd[5998]: Accepted publickey for core from 147.75.109.163 port 52274 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:19.850823 sshd-session[5998]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:19.854084 systemd-logind[1517]: New session 13 of user core. Feb 13 19:54:19.857060 systemd[1]: Started session-13.scope - Session 13 of User core. Feb 13 19:54:20.063383 sshd[6000]: Connection closed by 147.75.109.163 port 52274 Feb 13 19:54:20.067190 sshd-session[5998]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:20.079216 systemd[1]: Started sshd@11-139.178.70.104:22-147.75.109.163:52282.service - OpenSSH per-connection server daemon (147.75.109.163:52282). Feb 13 19:54:20.079544 systemd[1]: sshd@10-139.178.70.104:22-147.75.109.163:52274.service: Deactivated successfully. Feb 13 19:54:20.080695 systemd[1]: session-13.scope: Deactivated successfully. Feb 13 19:54:20.087286 systemd-logind[1517]: Session 13 logged out. Waiting for processes to exit. Feb 13 19:54:20.091520 systemd-logind[1517]: Removed session 13. 
Feb 13 19:54:20.146742 sshd[6007]: Accepted publickey for core from 147.75.109.163 port 52282 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:20.151629 sshd-session[6007]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:20.154324 systemd-logind[1517]: New session 14 of user core. Feb 13 19:54:20.162067 systemd[1]: Started session-14.scope - Session 14 of User core. Feb 13 19:54:20.317501 sshd[6011]: Connection closed by 147.75.109.163 port 52282 Feb 13 19:54:20.317443 sshd-session[6007]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:20.329007 systemd[1]: sshd@11-139.178.70.104:22-147.75.109.163:52282.service: Deactivated successfully. Feb 13 19:54:20.330274 systemd[1]: session-14.scope: Deactivated successfully. Feb 13 19:54:20.331020 systemd-logind[1517]: Session 14 logged out. Waiting for processes to exit. Feb 13 19:54:20.331570 systemd-logind[1517]: Removed session 14. Feb 13 19:54:25.328496 systemd[1]: Started sshd@12-139.178.70.104:22-147.75.109.163:52290.service - OpenSSH per-connection server daemon (147.75.109.163:52290). Feb 13 19:54:25.600641 sshd[6021]: Accepted publickey for core from 147.75.109.163 port 52290 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:25.601516 sshd-session[6021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:25.604328 systemd-logind[1517]: New session 15 of user core. Feb 13 19:54:25.609104 systemd[1]: Started session-15.scope - Session 15 of User core. Feb 13 19:54:25.704217 sshd[6023]: Connection closed by 147.75.109.163 port 52290 Feb 13 19:54:25.704154 sshd-session[6021]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:25.706248 systemd-logind[1517]: Session 15 logged out. Waiting for processes to exit. Feb 13 19:54:25.706571 systemd[1]: sshd@12-139.178.70.104:22-147.75.109.163:52290.service: Deactivated successfully. 
Feb 13 19:54:25.707858 systemd[1]: session-15.scope: Deactivated successfully. Feb 13 19:54:25.708615 systemd-logind[1517]: Removed session 15. Feb 13 19:54:30.711795 systemd[1]: Started sshd@13-139.178.70.104:22-147.75.109.163:43424.service - OpenSSH per-connection server daemon (147.75.109.163:43424). Feb 13 19:54:30.905069 sshd[6056]: Accepted publickey for core from 147.75.109.163 port 43424 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:30.907928 sshd-session[6056]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:30.911114 systemd-logind[1517]: New session 16 of user core. Feb 13 19:54:30.916139 systemd[1]: Started session-16.scope - Session 16 of User core. Feb 13 19:54:31.419822 sshd[6058]: Connection closed by 147.75.109.163 port 43424 Feb 13 19:54:31.421808 systemd[1]: sshd@13-139.178.70.104:22-147.75.109.163:43424.service: Deactivated successfully. Feb 13 19:54:31.420212 sshd-session[6056]: pam_unix(sshd:session): session closed for user core Feb 13 19:54:31.423227 systemd[1]: session-16.scope: Deactivated successfully. Feb 13 19:54:31.424366 systemd-logind[1517]: Session 16 logged out. Waiting for processes to exit. Feb 13 19:54:31.424897 systemd-logind[1517]: Removed session 16. Feb 13 19:54:36.431450 systemd[1]: Started sshd@14-139.178.70.104:22-147.75.109.163:43436.service - OpenSSH per-connection server daemon (147.75.109.163:43436). Feb 13 19:54:36.516910 sshd[6092]: Accepted publickey for core from 147.75.109.163 port 43436 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc Feb 13 19:54:36.518245 sshd-session[6092]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Feb 13 19:54:36.521173 systemd-logind[1517]: New session 17 of user core. Feb 13 19:54:36.526060 systemd[1]: Started session-17.scope - Session 17 of User core. 
Feb 13 19:54:36.687568 sshd[6094]: Connection closed by 147.75.109.163 port 43436
Feb 13 19:54:36.688508 sshd-session[6092]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:36.694547 systemd[1]: sshd@14-139.178.70.104:22-147.75.109.163:43436.service: Deactivated successfully.
Feb 13 19:54:36.695501 systemd[1]: session-17.scope: Deactivated successfully.
Feb 13 19:54:36.696274 systemd-logind[1517]: Session 17 logged out. Waiting for processes to exit.
Feb 13 19:54:36.697608 systemd[1]: Started sshd@15-139.178.70.104:22-147.75.109.163:43440.service - OpenSSH per-connection server daemon (147.75.109.163:43440).
Feb 13 19:54:36.698815 systemd-logind[1517]: Removed session 17.
Feb 13 19:54:36.743030 sshd[6105]: Accepted publickey for core from 147.75.109.163 port 43440 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:36.743796 sshd-session[6105]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:36.746350 systemd-logind[1517]: New session 18 of user core.
Feb 13 19:54:36.752071 systemd[1]: Started session-18.scope - Session 18 of User core.
Feb 13 19:54:37.155817 sshd[6107]: Connection closed by 147.75.109.163 port 43440
Feb 13 19:54:37.162965 sshd-session[6105]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:37.163351 systemd[1]: Started sshd@16-139.178.70.104:22-147.75.109.163:43446.service - OpenSSH per-connection server daemon (147.75.109.163:43446).
Feb 13 19:54:37.168016 systemd[1]: sshd@15-139.178.70.104:22-147.75.109.163:43440.service: Deactivated successfully.
Feb 13 19:54:37.169198 systemd[1]: session-18.scope: Deactivated successfully.
Feb 13 19:54:37.170373 systemd-logind[1517]: Session 18 logged out. Waiting for processes to exit.
Feb 13 19:54:37.171042 systemd-logind[1517]: Removed session 18.
Feb 13 19:54:37.266553 sshd[6115]: Accepted publickey for core from 147.75.109.163 port 43446 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:37.267806 sshd-session[6115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:37.277118 systemd-logind[1517]: New session 19 of user core.
Feb 13 19:54:37.280333 systemd[1]: Started session-19.scope - Session 19 of User core.
Feb 13 19:54:38.770304 sshd[6119]: Connection closed by 147.75.109.163 port 43446
Feb 13 19:54:38.773447 sshd-session[6115]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:38.783589 systemd[1]: sshd@16-139.178.70.104:22-147.75.109.163:43446.service: Deactivated successfully.
Feb 13 19:54:38.786534 systemd[1]: session-19.scope: Deactivated successfully.
Feb 13 19:54:38.787750 systemd-logind[1517]: Session 19 logged out. Waiting for processes to exit.
Feb 13 19:54:38.792925 systemd[1]: Started sshd@17-139.178.70.104:22-147.75.109.163:43450.service - OpenSSH per-connection server daemon (147.75.109.163:43450).
Feb 13 19:54:38.797467 systemd-logind[1517]: Removed session 19.
Feb 13 19:54:38.895779 sshd[6137]: Accepted publickey for core from 147.75.109.163 port 43450 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:38.897531 sshd-session[6137]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:38.901355 systemd-logind[1517]: New session 20 of user core.
Feb 13 19:54:38.906093 systemd[1]: Started session-20.scope - Session 20 of User core.
Feb 13 19:54:39.647136 sshd[6141]: Connection closed by 147.75.109.163 port 43450
Feb 13 19:54:39.647350 sshd-session[6137]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:39.653838 systemd[1]: sshd@17-139.178.70.104:22-147.75.109.163:43450.service: Deactivated successfully.
Feb 13 19:54:39.655960 systemd[1]: session-20.scope: Deactivated successfully.
Feb 13 19:54:39.657365 systemd-logind[1517]: Session 20 logged out. Waiting for processes to exit.
Feb 13 19:54:39.664411 systemd[1]: Started sshd@18-139.178.70.104:22-147.75.109.163:56976.service - OpenSSH per-connection server daemon (147.75.109.163:56976).
Feb 13 19:54:39.666108 systemd-logind[1517]: Removed session 20.
Feb 13 19:54:39.726094 sshd[6150]: Accepted publickey for core from 147.75.109.163 port 56976 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:39.727020 sshd-session[6150]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:39.729514 systemd-logind[1517]: New session 21 of user core.
Feb 13 19:54:39.737178 systemd[1]: Started session-21.scope - Session 21 of User core.
Feb 13 19:54:39.862297 sshd[6152]: Connection closed by 147.75.109.163 port 56976
Feb 13 19:54:39.862647 sshd-session[6150]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:39.864583 systemd[1]: sshd@18-139.178.70.104:22-147.75.109.163:56976.service: Deactivated successfully.
Feb 13 19:54:39.864641 systemd-logind[1517]: Session 21 logged out. Waiting for processes to exit.
Feb 13 19:54:39.866268 systemd[1]: session-21.scope: Deactivated successfully.
Feb 13 19:54:39.867259 systemd-logind[1517]: Removed session 21.
Feb 13 19:54:44.871102 systemd[1]: Started sshd@19-139.178.70.104:22-147.75.109.163:56992.service - OpenSSH per-connection server daemon (147.75.109.163:56992).
Feb 13 19:54:44.951827 sshd[6166]: Accepted publickey for core from 147.75.109.163 port 56992 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:44.953635 sshd-session[6166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:44.957940 systemd-logind[1517]: New session 22 of user core.
Feb 13 19:54:44.964094 systemd[1]: Started session-22.scope - Session 22 of User core.
Feb 13 19:54:45.098875 sshd[6168]: Connection closed by 147.75.109.163 port 56992
Feb 13 19:54:45.098814 sshd-session[6166]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:45.101237 systemd[1]: sshd@19-139.178.70.104:22-147.75.109.163:56992.service: Deactivated successfully.
Feb 13 19:54:45.104093 systemd[1]: session-22.scope: Deactivated successfully.
Feb 13 19:54:45.105818 systemd-logind[1517]: Session 22 logged out. Waiting for processes to exit.
Feb 13 19:54:45.108177 systemd-logind[1517]: Removed session 22.
Feb 13 19:54:50.113277 systemd[1]: Started sshd@20-139.178.70.104:22-147.75.109.163:42700.service - OpenSSH per-connection server daemon (147.75.109.163:42700).
Feb 13 19:54:50.247407 sshd[6181]: Accepted publickey for core from 147.75.109.163 port 42700 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:50.248738 sshd-session[6181]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:50.251707 systemd-logind[1517]: New session 23 of user core.
Feb 13 19:54:50.258218 systemd[1]: Started session-23.scope - Session 23 of User core.
Feb 13 19:54:50.432847 sshd[6183]: Connection closed by 147.75.109.163 port 42700
Feb 13 19:54:50.433387 sshd-session[6181]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:50.435051 systemd-logind[1517]: Session 23 logged out. Waiting for processes to exit.
Feb 13 19:54:50.435284 systemd[1]: sshd@20-139.178.70.104:22-147.75.109.163:42700.service: Deactivated successfully.
Feb 13 19:54:50.436566 systemd[1]: session-23.scope: Deactivated successfully.
Feb 13 19:54:50.437717 systemd-logind[1517]: Removed session 23.
Feb 13 19:54:55.443806 systemd[1]: Started sshd@21-139.178.70.104:22-147.75.109.163:42716.service - OpenSSH per-connection server daemon (147.75.109.163:42716).
Feb 13 19:54:55.486167 sshd[6204]: Accepted publickey for core from 147.75.109.163 port 42716 ssh2: RSA SHA256:4jl35jnnOH3rl7LWm5yPPSX4JqPO6ehKKs4ltsAjMcc
Feb 13 19:54:55.487088 sshd-session[6204]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Feb 13 19:54:55.490019 systemd-logind[1517]: New session 24 of user core.
Feb 13 19:54:55.497065 systemd[1]: Started session-24.scope - Session 24 of User core.
Feb 13 19:54:55.610918 sshd[6206]: Connection closed by 147.75.109.163 port 42716
Feb 13 19:54:55.611475 sshd-session[6204]: pam_unix(sshd:session): session closed for user core
Feb 13 19:54:55.613866 systemd[1]: sshd@21-139.178.70.104:22-147.75.109.163:42716.service: Deactivated successfully.
Feb 13 19:54:55.615558 systemd[1]: session-24.scope: Deactivated successfully.
Feb 13 19:54:55.616841 systemd-logind[1517]: Session 24 logged out. Waiting for processes to exit.
Feb 13 19:54:55.617529 systemd-logind[1517]: Removed session 24.