Jan 13 21:04:20.741302 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025
Jan 13 21:04:20.741318 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 21:04:20.741325 kernel: Disabled fast string operations
Jan 13 21:04:20.741329 kernel: BIOS-provided physical RAM map:
Jan 13 21:04:20.741333 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 21:04:20.741337 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 21:04:20.741343 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 21:04:20.741347 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 21:04:20.741351 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 21:04:20.741355 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 21:04:20.741359 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 21:04:20.741364 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 21:04:20.741368 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 21:04:20.741372 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 21:04:20.741378 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 21:04:20.741383 kernel: NX (Execute Disable) protection: active
Jan 13 21:04:20.741388 kernel: APIC: Static calls initialized
Jan 13 21:04:20.741393 kernel: SMBIOS 2.7 present.
Jan 13 21:04:20.741400 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 21:04:20.741407 kernel: vmware: hypercall mode: 0x00
Jan 13 21:04:20.741412 kernel: Hypervisor detected: VMware
Jan 13 21:04:20.741417 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 21:04:20.741423 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 21:04:20.741428 kernel: vmware: using clock offset of 3707271205 ns
Jan 13 21:04:20.741433 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 21:04:20.741438 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 21:04:20.741443 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 21:04:20.741448 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 21:04:20.741453 kernel: total RAM covered: 3072M
Jan 13 21:04:20.741458 kernel: Found optimal setting for mtrr clean up
Jan 13 21:04:20.741463 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 21:04:20.741469 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 21:04:20.741474 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 21:04:20.741479 kernel: Using GB pages for direct mapping
Jan 13 21:04:20.741484 kernel: ACPI: Early table checksum verification disabled
Jan 13 21:04:20.741489 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 21:04:20.741494 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 21:04:20.741499 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 21:04:20.741503 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 21:04:20.741508 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 21:04:20.741516 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 21:04:20.741521 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 21:04:20.741526 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 21:04:20.741532 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 21:04:20.741537 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 21:04:20.741543 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 21:04:20.741548 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 21:04:20.741553 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 21:04:20.741558 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 21:04:20.741566 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 21:04:20.741571 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 21:04:20.741576 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 21:04:20.741581 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 21:04:20.741586 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 21:04:20.741591 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 21:04:20.741598 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 21:04:20.741606 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 21:04:20.741612 kernel: system APIC only can use physical flat
Jan 13 21:04:20.741618 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 21:04:20.741623 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 21:04:20.741628 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 21:04:20.741633 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 21:04:20.741638 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 21:04:20.741642 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 21:04:20.741650 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 21:04:20.741655 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 21:04:20.741660 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 21:04:20.741665 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 21:04:20.741669 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 21:04:20.741674 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 21:04:20.741679 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 21:04:20.741684 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 21:04:20.741689 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 21:04:20.741694 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 21:04:20.741701 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 21:04:20.741708 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 21:04:20.741716 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 21:04:20.741724 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 21:04:20.741729 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 21:04:20.741734 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 21:04:20.741739 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 21:04:20.741744 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 21:04:20.741749 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 21:04:20.741755 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 21:04:20.741763 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 21:04:20.741770 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 21:04:20.741775 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 21:04:20.741780 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 21:04:20.741785 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 21:04:20.741790 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 21:04:20.741795 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 21:04:20.741800 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 21:04:20.741805 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 21:04:20.741810 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 21:04:20.741814 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 21:04:20.741821 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 21:04:20.741826 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 21:04:20.741831 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 21:04:20.741835 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 21:04:20.741840 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 21:04:20.741845 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 21:04:20.741850 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 21:04:20.741855 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 21:04:20.741860 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 21:04:20.741865 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 21:04:20.741871 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 21:04:20.741877 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 21:04:20.741881 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 21:04:20.741886 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 21:04:20.741891 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 21:04:20.741896 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 21:04:20.741901 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 21:04:20.741906 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 21:04:20.741911 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 21:04:20.741916 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 21:04:20.741922 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 21:04:20.741927 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 21:04:20.741932 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 21:04:20.741941 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 21:04:20.741948 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 21:04:20.741953 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 21:04:20.741958 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 21:04:20.741964 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 21:04:20.741970 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 21:04:20.741975 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 21:04:20.741981 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 21:04:20.741986 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 21:04:20.741992 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 21:04:20.741997 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 21:04:20.742002 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 21:04:20.742007 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 21:04:20.742013 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 21:04:20.742018 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 21:04:20.742023 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 21:04:20.742030 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 21:04:20.742035 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 21:04:20.742040 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 21:04:20.742046 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 21:04:20.742051 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 21:04:20.742056 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 21:04:20.742061 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 21:04:20.742067 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 21:04:20.742072 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 21:04:20.742078 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 21:04:20.742084 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 21:04:20.742089 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 21:04:20.742094 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 21:04:20.742100 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 21:04:20.742105 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 21:04:20.742110 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 21:04:20.742116 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 21:04:20.742124 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 21:04:20.742133 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 21:04:20.742140 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 21:04:20.742147 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 21:04:20.742152 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 21:04:20.742158 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 21:04:20.742163 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 21:04:20.742168 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 21:04:20.742174 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 21:04:20.742179 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 21:04:20.742540 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 21:04:20.742546 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 21:04:20.742552 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 21:04:20.742560 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 21:04:20.742565 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 21:04:20.742571 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 21:04:20.742576 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 21:04:20.742581 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 21:04:20.742586 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 21:04:20.742592 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 21:04:20.742597 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 21:04:20.742602 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 21:04:20.742607 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 21:04:20.742614 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 21:04:20.742619 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 21:04:20.742624 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 21:04:20.742630 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 21:04:20.742635 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 21:04:20.742641 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 21:04:20.742646 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 21:04:20.742651 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 21:04:20.742656 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 21:04:20.742662 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 21:04:20.742668 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 21:04:20.742673 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 21:04:20.742679 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 21:04:20.742684 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 21:04:20.742690 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 21:04:20.742696 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 21:04:20.742704 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 21:04:20.742710 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 21:04:20.742716 kernel: Zone ranges:
Jan 13 21:04:20.742721 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 21:04:20.742728 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 21:04:20.742734 kernel: Normal empty
Jan 13 21:04:20.742739 kernel: Movable zone start for each node
Jan 13 21:04:20.742745 kernel: Early memory node ranges
Jan 13 21:04:20.742750 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 21:04:20.742756 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 21:04:20.742761 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 21:04:20.742766 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 21:04:20.742772 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 21:04:20.742778 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 21:04:20.742784 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 21:04:20.742789 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 21:04:20.742795 kernel: system APIC only can use physical flat
Jan 13 21:04:20.742800 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 21:04:20.742806 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 21:04:20.742814 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1])
Jan 13 21:04:20.742819 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1])
Jan 13 21:04:20.742825 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1])
Jan 13 21:04:20.742830 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1])
Jan 13 21:04:20.742837 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1])
Jan 13 21:04:20.742842 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1])
Jan 13 21:04:20.742848 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1])
Jan 13 21:04:20.742853 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1])
Jan 13 21:04:20.742862 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1])
Jan 13 21:04:20.742868 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1])
Jan 13 21:04:20.742874 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1])
Jan 13 21:04:20.742879 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1])
Jan 13 21:04:20.742884 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1])
Jan 13 21:04:20.742891 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1])
Jan 13 21:04:20.742897 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1])
Jan 13 21:04:20.742902 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1])
Jan 13 21:04:20.742907 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1])
Jan 13 21:04:20.742913 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1])
Jan 13 21:04:20.742918 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1])
Jan 13 21:04:20.742923 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1])
Jan 13 21:04:20.742929 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1])
Jan 13 21:04:20.742934 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1])
Jan 13 21:04:20.742939 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1])
Jan 13 21:04:20.742946 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1])
Jan 13 21:04:20.742952 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1])
Jan 13 21:04:20.742957 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1])
Jan 13 21:04:20.742962 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1])
Jan 13 21:04:20.742968 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1])
Jan 13 21:04:20.742973 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1])
Jan 13 21:04:20.742978 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1])
Jan 13 21:04:20.742983 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1])
Jan 13 21:04:20.742989 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1])
Jan 13 21:04:20.742994 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1])
Jan 13 21:04:20.743001 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1])
Jan 13 21:04:20.743006 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1])
Jan 13 21:04:20.743012 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1])
Jan 13 21:04:20.743017 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1])
Jan 13 21:04:20.743022 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1])
Jan 13 21:04:20.743028 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1])
Jan 13 21:04:20.743033 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1])
Jan 13 21:04:20.743038 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1])
Jan 13 21:04:20.743043 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1])
Jan 13 21:04:20.743050 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1])
Jan 13 21:04:20.743055 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1])
Jan 13 21:04:20.743060 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1])
Jan 13 21:04:20.743066 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1])
Jan 13 21:04:20.743071 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1])
Jan 13 21:04:20.743076 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1])
Jan 13 21:04:20.743082 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1])
Jan 13 21:04:20.743087 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1])
Jan 13 21:04:20.743092 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1])
Jan 13 21:04:20.743098 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1])
Jan 13 21:04:20.743104 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1])
Jan 13 21:04:20.743109 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1])
Jan 13 21:04:20.743115 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1])
Jan 13 21:04:20.743120 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1])
Jan 13 21:04:20.743126 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1])
Jan 13 21:04:20.743131 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1])
Jan 13 21:04:20.743136 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1])
Jan 13 21:04:20.743141 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1])
Jan 13 21:04:20.743147 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1])
Jan 13 21:04:20.743152 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1])
Jan 13 21:04:20.743158 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1])
Jan 13 21:04:20.743164 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1])
Jan 13 21:04:20.743169 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1])
Jan 13 21:04:20.743175 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1])
Jan 13 21:04:20.743180 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1])
Jan 13 21:04:20.743192 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1])
Jan 13 21:04:20.743198 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1])
Jan 13 21:04:20.743203 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1])
Jan 13 21:04:20.743208 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1])
Jan 13 21:04:20.743215 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1])
Jan 13 21:04:20.743225 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1])
Jan 13 21:04:20.743230 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1])
Jan 13 21:04:20.743238 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1])
Jan 13 21:04:20.743244 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1])
Jan 13 21:04:20.743249 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1])
Jan 13 21:04:20.743255 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1])
Jan 13 21:04:20.743260 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1])
Jan 13 21:04:20.743266 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1])
Jan 13 21:04:20.743271 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1])
Jan 13 21:04:20.743278 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1])
Jan 13 21:04:20.743283 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1])
Jan 13 21:04:20.743289 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1])
Jan 13 21:04:20.743294 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1])
Jan 13 21:04:20.743300 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1])
Jan 13 21:04:20.743305 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1])
Jan 13 21:04:20.743310 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1])
Jan 13 21:04:20.743315 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1])
Jan 13 21:04:20.743321 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1])
Jan 13 21:04:20.743327 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1])
Jan 13 21:04:20.743333 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1])
Jan 13 21:04:20.743338 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1])
Jan 13 21:04:20.743343 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1])
Jan 13 21:04:20.743349 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1])
Jan 13 21:04:20.743354 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1])
Jan 13 21:04:20.743359 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1])
Jan 13 21:04:20.743365 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1])
Jan 13 21:04:20.743370 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1])
Jan 13 21:04:20.743375 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1])
Jan 13 21:04:20.743382 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1])
Jan 13 21:04:20.743388 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1])
Jan 13 21:04:20.743395 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1])
Jan 13 21:04:20.743400 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1])
Jan 13 21:04:20.743409 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1])
Jan 13 21:04:20.743418 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1])
Jan 13 21:04:20.743425 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1])
Jan 13 21:04:20.743430 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1])
Jan 13 21:04:20.743436 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1])
Jan 13 21:04:20.743441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1])
Jan 13 21:04:20.743448 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1])
Jan 13 21:04:20.743454 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1])
Jan 13 21:04:20.743459 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1])
Jan 13 21:04:20.743464 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1])
Jan 13 21:04:20.743470 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1])
Jan 13 21:04:20.743475 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1])
Jan 13 21:04:20.743480 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1])
Jan 13 21:04:20.743486 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1])
Jan 13 21:04:20.743491 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1])
Jan 13 21:04:20.743497 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1])
Jan 13 21:04:20.743503 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1])
Jan 13 21:04:20.743508 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1])
Jan 13 21:04:20.743513 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1])
Jan 13 21:04:20.743519 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1])
Jan 13 21:04:20.743524 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1])
Jan 13 21:04:20.743529 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1])
Jan 13 21:04:20.743534 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23
Jan 13 21:04:20.743540 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge)
Jan 13 21:04:20.743545 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 13 21:04:20.743552 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000
Jan 13 21:04:20.743558 kernel: TSC deadline timer available
Jan 13 21:04:20.743563 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs
Jan 13 21:04:20.743569 kernel: [mem 0x80000000-0xefffffff] available for PCI devices
Jan 13 21:04:20.743574 kernel: Booting paravirtualized kernel on VMware hypervisor
Jan 13 21:04:20.743580 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 13 21:04:20.743585 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1
Jan 13 21:04:20.743591 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144
Jan 13 21:04:20.743596 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152
Jan 13 21:04:20.743603 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007
Jan 13 21:04:20.743608 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015
Jan 13 21:04:20.743614 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023
Jan 13 21:04:20.743619 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031
Jan 13 21:04:20.743624 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039
Jan 13 21:04:20.743638 kernel: pcpu-alloc: [0] 040 041 042 043 044 045 046 047
Jan 13 21:04:20.743645 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055
Jan 13 21:04:20.743650 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063
Jan 13 21:04:20.743656 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071
Jan 13 21:04:20.743663 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079
Jan 13 21:04:20.743668 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087
Jan 13 21:04:20.743674 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095
Jan 13 21:04:20.743680 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103
Jan 13 21:04:20.743685 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111
Jan 13 21:04:20.743691 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119
Jan 13 21:04:20.743697 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127
Jan 13 21:04:20.743703 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 21:04:20.743711 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jan 13 21:04:20.743716 kernel: random: crng init done
Jan 13 21:04:20.743722 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes
Jan 13 21:04:20.743728 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes
Jan 13 21:04:20.743734 kernel: printk: log_buf_len min size: 262144 bytes
Jan 13 21:04:20.743739 kernel: printk: log_buf_len: 1048576 bytes
Jan 13 21:04:20.743745 kernel: printk: early log buf free: 239648(91%)
Jan 13 21:04:20.743751 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jan 13 21:04:20.743760 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 13 21:04:20.743771 kernel: Fallback order for Node 0: 0
Jan 13 21:04:20.743781 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808
Jan 13 21:04:20.743787 kernel: Policy zone: DMA32
Jan 13 21:04:20.743793 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 13 21:04:20.743799 kernel: Memory: 1936348K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 160020K reserved, 0K cma-reserved)
Jan 13 21:04:20.743808 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1
Jan 13 21:04:20.743813 kernel: ftrace: allocating 37920 entries in 149 pages
Jan 13 21:04:20.743819 kernel: ftrace: allocated 149 pages with 4 groups
Jan 13 21:04:20.743825 kernel: Dynamic Preempt: voluntary
Jan 13 21:04:20.743831 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 13 21:04:20.743837 kernel: rcu: RCU event tracing is enabled.
Jan 13 21:04:20.743843 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128.
Jan 13 21:04:20.743849 kernel: Trampoline variant of Tasks RCU enabled.
Jan 13 21:04:20.743855 kernel: Rude variant of Tasks RCU enabled.
Jan 13 21:04:20.743862 kernel: Tracing variant of Tasks RCU enabled.
Jan 13 21:04:20.743868 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 13 21:04:20.743873 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128
Jan 13 21:04:20.743879 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16
Jan 13 21:04:20.743885 kernel: rcu: srcu_init: Setting srcu_struct sizes to big.
Jan 13 21:04:20.743891 kernel: Console: colour VGA+ 80x25
Jan 13 21:04:20.743896 kernel: printk: console [tty0] enabled
Jan 13 21:04:20.743902 kernel: printk: console [ttyS0] enabled
Jan 13 21:04:20.743908 kernel: ACPI: Core revision 20230628
Jan 13 21:04:20.743914 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns
Jan 13 21:04:20.743921 kernel: APIC: Switch to symmetric I/O mode setup
Jan 13 21:04:20.743928 kernel: x2apic enabled
Jan 13 21:04:20.743936 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 13 21:04:20.743942 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jan 13 21:04:20.743948 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns
Jan 13 21:04:20.743954 kernel: Calibrating delay loop (skipped) preset value.. 6816.00 BogoMIPS (lpj=3408000)
Jan 13 21:04:20.743960 kernel: Disabled fast string operations
Jan 13 21:04:20.743966 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8
Jan 13 21:04:20.743975 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4
Jan 13 21:04:20.743983 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 13 21:04:20.743989 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit
Jan 13 21:04:20.743995 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall
Jan 13 21:04:20.744001 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS
Jan 13 21:04:20.744007 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Jan 13 21:04:20.744013 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT
Jan 13 21:04:20.744018 kernel: RETBleed: Mitigation: Enhanced IBRS
Jan 13 21:04:20.744024 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 13 21:04:20.744031 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 13 21:04:20.744037 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode
Jan 13 21:04:20.744043 kernel: SRBDS: Unknown: Dependent on hypervisor status
Jan 13 21:04:20.744049 kernel: GDS: Unknown: Dependent on hypervisor status
Jan 13 21:04:20.744055 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 13 21:04:20.744061 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 13 21:04:20.744066 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 13 21:04:20.744072 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jan 13 21:04:20.744078 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 13 21:04:20.744085 kernel: Freeing SMP alternatives memory: 32K
Jan 13 21:04:20.744091 kernel: pid_max: default: 131072 minimum: 1024
Jan 13 21:04:20.744097 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
Jan 13 21:04:20.744102 kernel: landlock: Up and running.
Jan 13 21:04:20.744108 kernel: SELinux: Initializing.
Jan 13 21:04:20.744114 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 21:04:20.744120 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jan 13 21:04:20.744126 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd)
Jan 13 21:04:20.744132 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 21:04:20.744139 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 21:04:20.744145 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128.
Jan 13 21:04:20.744151 kernel: Performance Events: Skylake events, core PMU driver.
Jan 13 21:04:20.744157 kernel: core: CPUID marked event: 'cpu cycles' unavailable
Jan 13 21:04:20.744163 kernel: core: CPUID marked event: 'instructions' unavailable
Jan 13 21:04:20.744168 kernel: core: CPUID marked event: 'bus cycles' unavailable
Jan 13 21:04:20.744175 kernel: core: CPUID marked event: 'cache references' unavailable
Jan 13 21:04:20.744180 kernel: core: CPUID marked event: 'cache misses' unavailable
Jan 13 21:04:20.744210 kernel: core: CPUID marked event: 'branch instructions' unavailable
Jan 13 21:04:20.744218 kernel: core: CPUID marked event: 'branch misses' unavailable
Jan 13 21:04:20.744223 kernel: ... version: 1
Jan 13 21:04:20.744229 kernel: ... bit width: 48
Jan 13 21:04:20.744235 kernel: ... generic registers: 4
Jan 13 21:04:20.744241 kernel: ... value mask: 0000ffffffffffff
Jan 13 21:04:20.744246 kernel: ...
max period: 000000007fffffff Jan 13 21:04:20.744252 kernel: ... fixed-purpose events: 0 Jan 13 21:04:20.744258 kernel: ... event mask: 000000000000000f Jan 13 21:04:20.744264 kernel: signal: max sigframe size: 1776 Jan 13 21:04:20.744271 kernel: rcu: Hierarchical SRCU implementation. Jan 13 21:04:20.744277 kernel: rcu: Max phase no-delay instances is 400. Jan 13 21:04:20.744283 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 21:04:20.744291 kernel: smp: Bringing up secondary CPUs ... Jan 13 21:04:20.744297 kernel: smpboot: x86: Booting SMP configuration: Jan 13 21:04:20.744303 kernel: .... node #0, CPUs: #1 Jan 13 21:04:20.744309 kernel: Disabled fast string operations Jan 13 21:04:20.744314 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 21:04:20.744320 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 21:04:20.744331 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 21:04:20.744337 kernel: smpboot: Max logical packages: 128 Jan 13 21:04:20.744343 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 21:04:20.744348 kernel: devtmpfs: initialized Jan 13 21:04:20.744354 kernel: x86/mm: Memory block size: 128MB Jan 13 21:04:20.744360 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 21:04:20.744366 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 21:04:20.744373 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 21:04:20.744379 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 21:04:20.744386 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 21:04:20.744392 kernel: audit: initializing netlink subsys (disabled) Jan 13 21:04:20.744398 kernel: audit: type=2000 audit(1736802259.067:1): state=initialized audit_enabled=0 res=1 Jan 13 21:04:20.744404 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 21:04:20.744409 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 21:04:20.744415 kernel: cpuidle: using governor menu Jan 13 21:04:20.744421 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 21:04:20.744427 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 21:04:20.744433 kernel: dca service started, version 1.12.1 Jan 13 21:04:20.744439 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 21:04:20.744445 kernel: PCI: Using configuration type 1 for base access Jan 13 21:04:20.744451 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 21:04:20.744457 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 21:04:20.744463 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 21:04:20.744469 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 21:04:20.744475 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 21:04:20.744480 kernel: ACPI: Added _OSI(Module Device) Jan 13 21:04:20.744486 kernel: ACPI: Added _OSI(Processor Device) Jan 13 21:04:20.744495 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 21:04:20.744505 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 21:04:20.744514 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 21:04:20.744520 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 21:04:20.744526 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 21:04:20.744532 kernel: ACPI: Interpreter enabled Jan 13 21:04:20.744538 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 21:04:20.744544 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 21:04:20.744549 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 21:04:20.744557 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 21:04:20.744563 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 21:04:20.744569 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 21:04:20.744648 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 21:04:20.744705 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 21:04:20.744755 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 21:04:20.744764 kernel: PCI host bridge to bus 0000:00 Jan 13 21:04:20.744814 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.744872 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.744919 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.744964 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 21:04:20.745008 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 21:04:20.745056 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 21:04:20.745122 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 21:04:20.745187 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 21:04:20.745252 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 21:04:20.745308 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 21:04:20.745361 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 21:04:20.745426 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 21:04:20.745478 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 21:04:20.745531 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 21:04:20.745580 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 21:04:20.745641 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 21:04:20.745692 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI Jan 13 21:04:20.745742 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB Jan 13 21:04:20.745796 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000 Jan 13 21:04:20.745862 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf] Jan 13 21:04:20.745938 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit] Jan 13 21:04:20.746002 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000 Jan 13 21:04:20.746055 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f] Jan 13 21:04:20.746108 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref] Jan 13 21:04:20.746178 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff] Jan 13 21:04:20.746263 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref] Jan 13 21:04:20.746329 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Jan 13 21:04:20.746388 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401 Jan 13 21:04:20.746444 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.746495 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.746549 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.746600 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.746658 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.746711 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.746770 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.746822 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.746949 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.747421 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.747489 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.747548 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot 
D3cold Jan 13 21:04:20.747606 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.747659 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.747717 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.747771 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.747843 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.747904 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.747971 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748028 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748085 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748138 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748208 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748263 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748320 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748373 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748440 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748503 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748632 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748695 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748751 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748803 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748858 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.748909 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.748979 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749036 kernel: pci 0000:00:17.1: 
PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749102 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749158 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749268 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749321 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749376 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749456 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749547 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749616 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749676 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749729 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749784 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749839 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.749893 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.749944 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750003 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750062 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750140 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750220 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750301 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750355 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750410 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750461 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750528 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400 Jan 13 
21:04:20.750581 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750648 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750712 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750768 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400 Jan 13 21:04:20.750818 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold Jan 13 21:04:20.750871 kernel: pci_bus 0000:01: extended config space not accessible Jan 13 21:04:20.750923 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 21:04:20.750977 kernel: pci_bus 0000:02: extended config space not accessible Jan 13 21:04:20.750986 kernel: acpiphp: Slot [32] registered Jan 13 21:04:20.750993 kernel: acpiphp: Slot [33] registered Jan 13 21:04:20.750999 kernel: acpiphp: Slot [34] registered Jan 13 21:04:20.751005 kernel: acpiphp: Slot [35] registered Jan 13 21:04:20.751014 kernel: acpiphp: Slot [36] registered Jan 13 21:04:20.751020 kernel: acpiphp: Slot [37] registered Jan 13 21:04:20.751026 kernel: acpiphp: Slot [38] registered Jan 13 21:04:20.751032 kernel: acpiphp: Slot [39] registered Jan 13 21:04:20.751040 kernel: acpiphp: Slot [40] registered Jan 13 21:04:20.751048 kernel: acpiphp: Slot [41] registered Jan 13 21:04:20.751056 kernel: acpiphp: Slot [42] registered Jan 13 21:04:20.751062 kernel: acpiphp: Slot [43] registered Jan 13 21:04:20.751068 kernel: acpiphp: Slot [44] registered Jan 13 21:04:20.751073 kernel: acpiphp: Slot [45] registered Jan 13 21:04:20.751079 kernel: acpiphp: Slot [46] registered Jan 13 21:04:20.751085 kernel: acpiphp: Slot [47] registered Jan 13 21:04:20.751091 kernel: acpiphp: Slot [48] registered Jan 13 21:04:20.751098 kernel: acpiphp: Slot [49] registered Jan 13 21:04:20.751104 kernel: acpiphp: Slot [50] registered Jan 13 21:04:20.751110 kernel: acpiphp: Slot [51] registered Jan 13 21:04:20.751115 kernel: acpiphp: Slot [52] registered Jan 13 21:04:20.751121 kernel: acpiphp: Slot [53] registered 
Jan 13 21:04:20.751127 kernel: acpiphp: Slot [54] registered Jan 13 21:04:20.751133 kernel: acpiphp: Slot [55] registered Jan 13 21:04:20.751139 kernel: acpiphp: Slot [56] registered Jan 13 21:04:20.751144 kernel: acpiphp: Slot [57] registered Jan 13 21:04:20.751150 kernel: acpiphp: Slot [58] registered Jan 13 21:04:20.751157 kernel: acpiphp: Slot [59] registered Jan 13 21:04:20.751163 kernel: acpiphp: Slot [60] registered Jan 13 21:04:20.751169 kernel: acpiphp: Slot [61] registered Jan 13 21:04:20.751176 kernel: acpiphp: Slot [62] registered Jan 13 21:04:20.751190 kernel: acpiphp: Slot [63] registered Jan 13 21:04:20.751257 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode) Jan 13 21:04:20.751311 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 21:04:20.751360 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 21:04:20.751411 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 21:04:20.751463 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode) Jan 13 21:04:20.751514 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode) Jan 13 21:04:20.751564 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode) Jan 13 21:04:20.751618 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode) Jan 13 21:04:20.751668 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode) Jan 13 21:04:20.751745 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700 Jan 13 21:04:20.751816 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007] Jan 13 21:04:20.751873 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit] Jan 13 21:04:20.751926 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 21:04:20.751977 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold Jan 13 
21:04:20.752033 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force' Jan 13 21:04:20.752085 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 21:04:20.752138 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 21:04:20.752251 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 21:04:20.752321 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 21:04:20.752384 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 21:04:20.752435 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 21:04:20.752485 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 21:04:20.752563 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 21:04:20.752652 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 21:04:20.752733 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 21:04:20.752784 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 21:04:20.752842 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 21:04:20.752909 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 21:04:20.752961 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 21:04:20.753012 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 21:04:20.753062 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 21:04:20.753112 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 21:04:20.753166 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 21:04:20.753573 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 21:04:20.753628 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 21:04:20.753685 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 21:04:20.753745 kernel: pci 0000:00:15.6: bridge window [mem 
0xfbd00000-0xfbdfffff] Jan 13 21:04:20.753796 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 21:04:20.753853 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 21:04:20.753913 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 21:04:20.753964 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 21:04:20.754021 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000 Jan 13 21:04:20.754074 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff] Jan 13 21:04:20.754126 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff] Jan 13 21:04:20.754177 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff] Jan 13 21:04:20.754255 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f] Jan 13 21:04:20.754308 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref] Jan 13 21:04:20.754361 kernel: pci 0000:0b:00.0: supports D1 D2 Jan 13 21:04:20.754416 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold Jan 13 21:04:20.754470 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. 
You can enable it with 'pcie_aspm=force' Jan 13 21:04:20.754521 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 21:04:20.754571 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 21:04:20.754621 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff] Jan 13 21:04:20.754675 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 21:04:20.754725 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 21:04:20.754779 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 21:04:20.754836 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 21:04:20.754888 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 21:04:20.754952 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 21:04:20.755004 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 21:04:20.755054 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 21:04:20.755108 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 21:04:20.755158 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 21:04:20.755238 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 21:04:20.755296 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 21:04:20.755355 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 21:04:20.755409 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 21:04:20.755477 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 21:04:20.755527 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 21:04:20.757225 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 21:04:20.757296 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 21:04:20.757354 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 21:04:20.757410 kernel: pci 0000:00:16.6: bridge 
window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 21:04:20.757470 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 21:04:20.757523 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 21:04:20.757573 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 21:04:20.757629 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 21:04:20.757690 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 21:04:20.757741 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 21:04:20.757791 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 21:04:20.757868 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 21:04:20.757922 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 21:04:20.757976 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 21:04:20.758039 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 21:04:20.758096 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 21:04:20.758147 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 21:04:20.758233 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 21:04:20.758287 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 21:04:20.758339 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 21:04:20.758389 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 21:04:20.758440 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 21:04:20.758492 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 21:04:20.758550 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 21:04:20.758602 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 21:04:20.758655 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 21:04:20.758724 kernel: pci 
0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 21:04:20.758778 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 21:04:20.758831 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 21:04:20.758882 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 21:04:20.758932 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 21:04:20.758987 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 21:04:20.759039 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 21:04:20.759106 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 21:04:20.759161 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 21:04:20.761240 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 21:04:20.761300 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 21:04:20.761354 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 21:04:20.761409 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 21:04:20.761464 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 21:04:20.761515 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 21:04:20.761568 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 21:04:20.761620 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 21:04:20.761671 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 21:04:20.761721 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 21:04:20.761773 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 21:04:20.761826 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 21:04:20.761875 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 21:04:20.761927 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 
21:04:20.761977 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 21:04:20.762027 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 21:04:20.762079 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 21:04:20.762129 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 21:04:20.762178 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 21:04:20.762241 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 21:04:20.762292 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 21:04:20.762342 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 21:04:20.762393 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 21:04:20.762443 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 21:04:20.762493 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 21:04:20.762502 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9 Jan 13 21:04:20.762508 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0 Jan 13 21:04:20.762514 kernel: ACPI: PCI: Interrupt link LNKB disabled Jan 13 21:04:20.762522 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Jan 13 21:04:20.762528 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10 Jan 13 21:04:20.762534 kernel: iommu: Default domain type: Translated Jan 13 21:04:20.762540 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Jan 13 21:04:20.762546 kernel: PCI: Using ACPI for IRQ routing Jan 13 21:04:20.762552 kernel: PCI: pci_cache_line_size set to 64 bytes Jan 13 21:04:20.762559 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff] Jan 13 21:04:20.762565 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff] Jan 13 21:04:20.762614 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device Jan 13 21:04:20.762667 kernel: pci 0000:00:0f.0: vgaarb: bridge control 
possible Jan 13 21:04:20.762718 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Jan 13 21:04:20.762727 kernel: vgaarb: loaded Jan 13 21:04:20.762734 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 Jan 13 21:04:20.762740 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter Jan 13 21:04:20.762746 kernel: clocksource: Switched to clocksource tsc-early Jan 13 21:04:20.762752 kernel: VFS: Disk quotas dquot_6.6.0 Jan 13 21:04:20.762758 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 13 21:04:20.762764 kernel: pnp: PnP ACPI init Jan 13 21:04:20.762822 kernel: system 00:00: [io 0x1000-0x103f] has been reserved Jan 13 21:04:20.762870 kernel: system 00:00: [io 0x1040-0x104f] has been reserved Jan 13 21:04:20.762916 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved Jan 13 21:04:20.762965 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved Jan 13 21:04:20.763015 kernel: pnp 00:06: [dma 2] Jan 13 21:04:20.763064 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved Jan 13 21:04:20.763113 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved Jan 13 21:04:20.763158 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved Jan 13 21:04:20.763167 kernel: pnp: PnP ACPI: found 8 devices Jan 13 21:04:20.763174 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Jan 13 21:04:20.763179 kernel: NET: Registered PF_INET protocol family Jan 13 21:04:20.764595 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 13 21:04:20.764602 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) Jan 13 21:04:20.764611 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 13 21:04:20.764620 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) Jan 13 21:04:20.764626 kernel: 
TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) Jan 13 21:04:20.764632 kernel: TCP: Hash tables configured (established 16384 bind 16384) Jan 13 21:04:20.764638 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 21:04:20.764644 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) Jan 13 21:04:20.764650 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 13 21:04:20.764656 kernel: NET: Registered PF_XDP protocol family Jan 13 21:04:20.764722 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000 Jan 13 21:04:20.764782 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 13 21:04:20.764837 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 13 21:04:20.764890 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 13 21:04:20.764943 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 13 21:04:20.764995 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000 Jan 13 21:04:20.765048 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000 Jan 13 21:04:20.765103 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Jan 13 21:04:20.765155 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Jan 13 21:04:20.765221 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Jan 13 21:04:20.765274 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Jan 13 21:04:20.765326 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Jan 13 21:04:20.765377 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Jan 13 
21:04:20.765431 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Jan 13 21:04:20.765484 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000 Jan 13 21:04:20.765535 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Jan 13 21:04:20.765586 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Jan 13 21:04:20.765637 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Jan 13 21:04:20.765687 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Jan 13 21:04:20.765740 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Jan 13 21:04:20.765791 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Jan 13 21:04:20.765840 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Jan 13 21:04:20.765890 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000 Jan 13 21:04:20.765941 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 21:04:20.765992 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 21:04:20.766045 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.766094 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.766145 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768215 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768271 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768322 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768372 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768422 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 
13 21:04:20.768476 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768527 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768577 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768627 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768677 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768727 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768777 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768827 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768880 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.768930 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.768979 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.769029 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.769080 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.769130 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.769179 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.769237 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.769290 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770280 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770334 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770384 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770434 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770484 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770534 kernel: pci 
0000:00:18.2: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770585 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770638 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770687 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770737 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770786 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770836 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770886 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.770935 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.770984 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.771054 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772241 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772305 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772356 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772406 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772455 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772506 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772555 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772604 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772657 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772706 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772756 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772806 kernel: pci 0000:00:18.2: BAR 13: no space for [io 
size 0x1000] Jan 13 21:04:20.772855 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.772906 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.772956 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773005 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773054 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773104 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773157 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773214 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773269 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773319 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773369 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773419 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773468 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773518 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773568 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773622 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773671 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773721 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773770 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773820 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773870 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.773920 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.773969 
kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.774019 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.774068 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.774120 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.774169 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.776878 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.776931 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.776982 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000] Jan 13 21:04:20.777032 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000] Jan 13 21:04:20.777083 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Jan 13 21:04:20.777134 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] Jan 13 21:04:20.777235 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff] Jan 13 21:04:20.777296 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff] Jan 13 21:04:20.777346 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 21:04:20.777401 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref] Jan 13 21:04:20.777451 kernel: pci 0000:00:15.0: PCI bridge to [bus 03] Jan 13 21:04:20.777501 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff] Jan 13 21:04:20.777551 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff] Jan 13 21:04:20.777601 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 21:04:20.777652 kernel: pci 0000:00:15.1: PCI bridge to [bus 04] Jan 13 21:04:20.777704 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff] Jan 13 21:04:20.777754 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff] Jan 13 21:04:20.777804 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 
21:04:20.777870 kernel: pci 0000:00:15.2: PCI bridge to [bus 05] Jan 13 21:04:20.777921 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff] Jan 13 21:04:20.777971 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff] Jan 13 21:04:20.778020 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 21:04:20.778070 kernel: pci 0000:00:15.3: PCI bridge to [bus 06] Jan 13 21:04:20.778119 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff] Jan 13 21:04:20.778170 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 21:04:20.778231 kernel: pci 0000:00:15.4: PCI bridge to [bus 07] Jan 13 21:04:20.778281 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff] Jan 13 21:04:20.778331 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 21:04:20.778383 kernel: pci 0000:00:15.5: PCI bridge to [bus 08] Jan 13 21:04:20.778433 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff] Jan 13 21:04:20.778482 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 21:04:20.778535 kernel: pci 0000:00:15.6: PCI bridge to [bus 09] Jan 13 21:04:20.778585 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff] Jan 13 21:04:20.778634 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 21:04:20.778684 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a] Jan 13 21:04:20.778734 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff] Jan 13 21:04:20.778784 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 21:04:20.778837 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref] Jan 13 21:04:20.778888 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b] Jan 13 21:04:20.778937 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff] Jan 13 21:04:20.778990 kernel: pci 0000:00:16.0: bridge window [mem 
0xfd400000-0xfd4fffff] Jan 13 21:04:20.779040 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 21:04:20.779091 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c] Jan 13 21:04:20.779142 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff] Jan 13 21:04:20.779202 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff] Jan 13 21:04:20.779269 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 21:04:20.779321 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d] Jan 13 21:04:20.779371 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff] Jan 13 21:04:20.779421 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff] Jan 13 21:04:20.779475 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 21:04:20.779525 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e] Jan 13 21:04:20.779575 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff] Jan 13 21:04:20.779626 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 21:04:20.779677 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f] Jan 13 21:04:20.779728 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff] Jan 13 21:04:20.779778 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 21:04:20.779829 kernel: pci 0000:00:16.5: PCI bridge to [bus 10] Jan 13 21:04:20.779879 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff] Jan 13 21:04:20.779930 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 21:04:20.779982 kernel: pci 0000:00:16.6: PCI bridge to [bus 11] Jan 13 21:04:20.780032 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff] Jan 13 21:04:20.780082 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 21:04:20.780132 kernel: pci 0000:00:16.7: PCI bridge to [bus 12] Jan 13 21:04:20.780196 kernel: pci 0000:00:16.7: 
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 21:04:20.780257 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 21:04:20.780308 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 21:04:20.780358 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 21:04:20.780408 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 21:04:20.780460 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 21:04:20.780511 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 21:04:20.780562 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 21:04:20.780612 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 21:04:20.780662 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 21:04:20.780713 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 21:04:20.780763 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 21:04:20.780813 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 21:04:20.780865 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 21:04:20.780916 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 21:04:20.780970 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 21:04:20.781022 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 21:04:20.781072 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 21:04:20.782277 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 21:04:20.782335 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 21:04:20.782388 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 21:04:20.782439 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 21:04:20.782491 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
21:04:20.782541 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 21:04:20.782595 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 21:04:20.782646 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 21:04:20.782695 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 21:04:20.782746 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 21:04:20.782795 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 21:04:20.782847 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 21:04:20.782897 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 21:04:20.782947 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 21:04:20.782997 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 21:04:20.783048 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 21:04:20.783100 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 21:04:20.783150 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 21:04:20.783213 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 21:04:20.783274 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 21:04:20.783325 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 21:04:20.783374 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 21:04:20.783424 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 21:04:20.783474 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 21:04:20.783524 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 21:04:20.783577 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 21:04:20.783628 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 21:04:20.783678 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 21:04:20.783728 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 21:04:20.783778 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 21:04:20.783828 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 21:04:20.783878 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 21:04:20.783929 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 21:04:20.783979 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 21:04:20.784030 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 21:04:20.784083 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 21:04:20.784133 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 21:04:20.785193 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.785254 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.785303 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.785348 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 21:04:20.785393 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 21:04:20.785442 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 21:04:20.785493 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 21:04:20.785539 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 21:04:20.785585 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.785630 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.785677 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.785723 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 21:04:20.785768 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 21:04:20.785822 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 21:04:20.785869 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 21:04:20.785915 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 21:04:20.785966 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 21:04:20.786012 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 21:04:20.786058 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 21:04:20.786107 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 21:04:20.786156 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 21:04:20.786532 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 21:04:20.786589 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 21:04:20.786637 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 21:04:20.786690 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 21:04:20.786737 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 21:04:20.786790 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 21:04:20.786838 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 21:04:20.786888 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 21:04:20.786935 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 21:04:20.786987 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 21:04:20.787043 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 21:04:20.787097 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 21:04:20.787145 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 21:04:20.787230 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 21:04:20.787287 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 21:04:20.787335 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 21:04:20.787383 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 21:04:20.787434 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 21:04:20.787485 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 21:04:20.787534 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 21:04:20.787585 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 21:04:20.787632 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 21:04:20.787685 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 21:04:20.787732 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 21:04:20.787784 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 21:04:20.787831 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 21:04:20.787885 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 21:04:20.787931 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 21:04:20.787981 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 21:04:20.788028 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 21:04:20.788080 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 21:04:20.788127 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 21:04:20.788174 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 21:04:20.788279 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 21:04:20.788327 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 21:04:20.788374 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 21:04:20.788424 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 21:04:20.788474 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 21:04:20.788520 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 21:04:20.788570 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 21:04:20.788618 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 21:04:20.788667 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 21:04:20.788714 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 21:04:20.788770 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 21:04:20.788817 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 21:04:20.788867 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 21:04:20.788914 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 21:04:20.788965 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 21:04:20.789012 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 21:04:20.789065 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 21:04:20.789112 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 21:04:20.789159 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 21:04:20.789217 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 21:04:20.789266 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 21:04:20.789312 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 21:04:20.789362 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 21:04:20.789412 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 21:04:20.789464 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 21:04:20.789511 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 21:04:20.789561 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 21:04:20.789609 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 21:04:20.789661 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 21:04:20.789711 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 21:04:20.789761 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 21:04:20.789811 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 21:04:20.789861 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 21:04:20.789909 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 21:04:20.789965 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 21:04:20.789977 kernel: PCI: CLS 32 bytes, default 64 Jan 13 21:04:20.789984 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 21:04:20.789991 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 21:04:20.789997 kernel: clocksource: Switched to clocksource tsc Jan 13 21:04:20.790003 kernel: Initialise system trusted keyrings Jan 13 21:04:20.790010 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 21:04:20.790016 kernel: Key type asymmetric registered Jan 13 21:04:20.790022 kernel: Asymmetric key parser 'x509' registered Jan 13 21:04:20.790028 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 21:04:20.790036 kernel: io scheduler mq-deadline registered Jan 13 21:04:20.790043 kernel: io scheduler kyber registered Jan 13 21:04:20.790049 kernel: io scheduler bfq registered Jan 13 21:04:20.790102 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 21:04:20.790155 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790556 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 21:04:20.790614 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790668 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 21:04:20.790724 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790776 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 21:04:20.790828 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790879 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 21:04:20.790930 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790981 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 21:04:20.791036 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791087 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 21:04:20.791139 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791424 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 21:04:20.791480 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791536 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 21:04:20.791588 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791640 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 21:04:20.791692 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791743 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 21:04:20.791795 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791846 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 21:04:20.791899 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791950 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 21:04:20.792002 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.792053 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 21:04:20.792104 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.792159 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 21:04:20.792304 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793280 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 21:04:20.793341 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793395 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 21:04:20.793448 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793503 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 21:04:20.793554 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793605 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 21:04:20.793656 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793707 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 21:04:20.793758 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793808 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 21:04:20.793862 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793913 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 21:04:20.793964 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.794014 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 21:04:20.794066 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.794120 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 21:04:20.794172 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795270 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 21:04:20.795328 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795382 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 21:04:20.795434 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795489 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 21:04:20.795539 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795590 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 21:04:20.795641 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795693 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 21:04:20.795747 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795798 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 21:04:20.795849 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795899 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 21:04:20.795950 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.796000 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 21:04:20.796054 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.796064 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 21:04:20.796070 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 21:04:20.796077 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 21:04:20.796083 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 21:04:20.796090 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 21:04:20.796096 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 21:04:20.796149 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 21:04:20.796534 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T21:04:20 UTC (1736802260) Jan 13 21:04:20.796587 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 21:04:20.796597 kernel: intel_pstate: CPU model not supported Jan 13 21:04:20.796604 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 21:04:20.796610 kernel: NET: Registered PF_INET6 protocol family Jan 13 21:04:20.796617 kernel: Segment Routing with IPv6 Jan 13 21:04:20.796623 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 21:04:20.796632 kernel: NET: Registered PF_PACKET protocol family Jan 13 21:04:20.796638 kernel: Key type dns_resolver registered Jan 13 21:04:20.796645 kernel: IPI shorthand broadcast: enabled Jan 13 21:04:20.796651 kernel: sched_clock: Marking stable (885446644, 225799539)->(1171383470, -60137287) Jan 13 21:04:20.796657 kernel: registered taskstats version 1 Jan 13 21:04:20.796663 kernel: Loading compiled-in X.509 certificates Jan 13 21:04:20.796669 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 21:04:20.796676 kernel: Key type .fscrypt registered Jan 13 21:04:20.796683 kernel: Key type fscrypt-provisioning registered Jan 13 21:04:20.796690 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 21:04:20.796697 kernel: ima: Allocated hash algorithm: sha1 Jan 13 21:04:20.796703 kernel: ima: No architecture policies found Jan 13 21:04:20.796709 kernel: clk: Disabling unused clocks Jan 13 21:04:20.796715 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 21:04:20.796723 kernel: Write protecting the kernel read-only data: 36864k Jan 13 21:04:20.796729 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 21:04:20.796736 kernel: Run /init as init process Jan 13 21:04:20.796742 kernel: with arguments: Jan 13 21:04:20.796750 kernel: /init Jan 13 21:04:20.796756 kernel: with environment: Jan 13 21:04:20.796762 kernel: HOME=/ Jan 13 21:04:20.796768 kernel: TERM=linux Jan 13 21:04:20.796774 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 21:04:20.796781 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 21:04:20.796789 systemd[1]: Detected virtualization vmware. Jan 13 21:04:20.796796 systemd[1]: Detected architecture x86-64. Jan 13 21:04:20.796803 systemd[1]: Running in initrd. Jan 13 21:04:20.796810 systemd[1]: No hostname configured, using default hostname. Jan 13 21:04:20.796817 systemd[1]: Hostname set to . Jan 13 21:04:20.796824 systemd[1]: Initializing machine ID from random generator. Jan 13 21:04:20.796830 systemd[1]: Queued start job for default target initrd.target. Jan 13 21:04:20.796836 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 21:04:20.796843 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 21:04:20.796850 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 21:04:20.796858 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 21:04:20.796865 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 21:04:20.796871 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 21:04:20.796879 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 21:04:20.796886 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 21:04:20.796892 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 21:04:20.796899 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 21:04:20.796907 systemd[1]: Reached target paths.target - Path Units. Jan 13 21:04:20.796913 systemd[1]: Reached target slices.target - Slice Units. Jan 13 21:04:20.796920 systemd[1]: Reached target swap.target - Swaps. Jan 13 21:04:20.796926 systemd[1]: Reached target timers.target - Timer Units. Jan 13 21:04:20.796933 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 21:04:20.796939 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 21:04:20.796945 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 21:04:20.796952 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 21:04:20.796958 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 21:04:20.796966 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 21:04:20.796973 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 21:04:20.796979 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 21:04:20.796986 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 21:04:20.796992 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 21:04:20.796999 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 21:04:20.797005 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 21:04:20.797012 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 21:04:20.797019 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 21:04:20.797026 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:20.797230 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 21:04:20.797250 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 21:04:20.797260 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 21:04:20.797266 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 21:04:20.797273 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 21:04:20.797280 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 21:04:20.797288 kernel: Bridge firewalling registered Jan 13 21:04:20.797295 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 21:04:20.797302 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 21:04:20.797309 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:20.797315 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Jan 13 21:04:20.797322 systemd-journald[217]: Journal started Jan 13 21:04:20.797337 systemd-journald[217]: Runtime Journal (/run/log/journal/1d073833aaed44bc9a4bb9c5dbd400a3) is 4.8M, max 38.7M, 33.8M free. Jan 13 21:04:20.748649 systemd-modules-load[218]: Inserted module 'overlay' Jan 13 21:04:20.799657 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:20.772263 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 13 21:04:20.802319 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 21:04:20.802341 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 21:04:20.802640 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 21:04:20.808329 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 21:04:20.808541 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 21:04:20.818415 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:20.820280 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 21:04:20.822376 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 21:04:20.824330 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 21:04:20.829090 dracut-cmdline[248]: dracut-dracut-053 Jan 13 21:04:20.832988 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 21:04:20.847408 systemd-resolved[250]: Positive Trust Anchors: Jan 13 21:04:20.847418 systemd-resolved[250]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 21:04:20.847446 systemd-resolved[250]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 21:04:20.849187 systemd-resolved[250]: Defaulting to hostname 'linux'. Jan 13 21:04:20.849859 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 21:04:20.850392 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 21:04:20.882199 kernel: SCSI subsystem initialized Jan 13 21:04:20.888191 kernel: Loading iSCSI transport class v2.0-870. 
Jan 13 21:04:20.895196 kernel: iscsi: registered transport (tcp) Jan 13 21:04:20.908454 kernel: iscsi: registered transport (qla4xxx) Jan 13 21:04:20.908519 kernel: QLogic iSCSI HBA Driver Jan 13 21:04:20.930718 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 21:04:20.935307 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 21:04:20.950283 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 13 21:04:20.950321 kernel: device-mapper: uevent: version 1.0.3 Jan 13 21:04:20.951341 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 21:04:20.983207 kernel: raid6: avx2x4 gen() 52252 MB/s Jan 13 21:04:20.999239 kernel: raid6: avx2x2 gen() 52497 MB/s Jan 13 21:04:21.016406 kernel: raid6: avx2x1 gen() 44594 MB/s Jan 13 21:04:21.016448 kernel: raid6: using algorithm avx2x2 gen() 52497 MB/s Jan 13 21:04:21.034396 kernel: raid6: .... xor() 31350 MB/s, rmw enabled Jan 13 21:04:21.034416 kernel: raid6: using avx2x2 recovery algorithm Jan 13 21:04:21.048200 kernel: xor: automatically using best checksumming function avx Jan 13 21:04:21.146216 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 21:04:21.151467 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 21:04:21.157283 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 21:04:21.164977 systemd-udevd[433]: Using default interface naming scheme 'v255'. Jan 13 21:04:21.167551 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 21:04:21.173301 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 21:04:21.180390 dracut-pre-trigger[438]: rd.md=0: removing MD RAID activation Jan 13 21:04:21.197683 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Jan 13 21:04:21.201265 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 21:04:21.274810 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 21:04:21.279374 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 21:04:21.286358 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 21:04:21.287438 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 21:04:21.287762 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 21:04:21.288131 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 21:04:21.292297 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 21:04:21.301411 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 21:04:21.336197 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 21:04:21.339236 kernel: vmw_pvscsi: using 64bit dma Jan 13 21:04:21.341192 kernel: vmw_pvscsi: max_id: 16 Jan 13 21:04:21.341209 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 21:04:21.349215 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 21:04:21.349246 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 21:04:21.349255 kernel: vmw_pvscsi: using MSI-X Jan 13 21:04:21.349262 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 21:04:21.358908 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 21:04:21.360994 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 21:04:21.361074 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 21:04:21.363241 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 21:04:21.369159 kernel: libata version 3.00 loaded. 
Jan 13 21:04:21.369170 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 21:04:21.375762 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 21:04:21.375837 kernel: scsi host1: ata_piix Jan 13 21:04:21.375902 kernel: scsi host2: ata_piix Jan 13 21:04:21.375966 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 21:04:21.375974 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 21:04:21.377292 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 21:04:21.381191 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 21:04:21.385176 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 21:04:21.385217 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:21.385628 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:21.385858 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 21:04:21.385884 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:21.386100 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:21.394314 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:21.407159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:21.411398 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:21.423762 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:21.546214 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 21:04:21.552199 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 21:04:21.563399 kernel: AVX2 version of gcm_enc/dec engaged. 
Jan 13 21:04:21.563437 kernel: AES CTR mode by8 optimization enabled Jan 13 21:04:21.573418 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 21:04:21.592080 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 21:04:21.592384 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 21:04:21.592722 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 21:04:21.592811 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 21:04:21.592917 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 21:04:21.593024 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 21:04:21.593035 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 21:04:21.593115 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:21.593124 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 21:04:21.628194 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (496) Jan 13 21:04:21.628233 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (499) Jan 13 21:04:21.628271 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 21:04:21.633865 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 21:04:21.636994 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 21:04:21.639643 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 21:04:21.639794 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 21:04:21.647278 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 13 21:04:21.693213 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:21.698209 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:22.700199 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:22.700641 disk-uuid[595]: The operation has completed successfully. Jan 13 21:04:22.734875 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 21:04:22.735206 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 21:04:22.740263 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 21:04:22.742373 sh[611]: Success Jan 13 21:04:22.750200 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 21:04:22.811483 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 21:04:22.812545 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 21:04:22.812878 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 21:04:22.827906 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb Jan 13 21:04:22.827942 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:22.827951 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 21:04:22.828994 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 21:04:22.830482 kernel: BTRFS info (device dm-0): using free space tree Jan 13 21:04:22.837212 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 21:04:22.839490 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 21:04:22.847291 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 21:04:22.848559 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 21:04:22.870206 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:22.872199 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:22.872219 kernel: BTRFS info (device sda6): using free space tree Jan 13 21:04:22.889398 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 21:04:22.893803 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 21:04:22.895250 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:22.897427 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 21:04:22.901682 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 21:04:22.913030 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 21:04:22.919542 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 21:04:22.983142 ignition[671]: Ignition 2.20.0 Jan 13 21:04:22.983283 ignition[671]: Stage: fetch-offline Jan 13 21:04:22.983165 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 13 21:04:22.983302 ignition[671]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:22.983307 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:22.983363 ignition[671]: parsed url from cmdline: "" Jan 13 21:04:22.983365 ignition[671]: no config URL provided Jan 13 21:04:22.983368 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 21:04:22.983373 ignition[671]: no config at "/usr/lib/ignition/user.ign" Jan 13 21:04:22.983717 ignition[671]: config successfully fetched Jan 13 21:04:22.983734 ignition[671]: parsing config with SHA512: 9afce9029d2d8820adcac0b0a5955968a6e94aa2ce22de805e79d3bbcf06f1dced2f1348b154596a909187013da21f53eaf1a837169882392d24d6252a22fc5b Jan 13 21:04:22.987683 unknown[671]: fetched base config from "system" Jan 13 21:04:22.987950 ignition[671]: fetch-offline: fetch-offline passed Jan 13 21:04:22.987688 unknown[671]: fetched user config from "vmware" Jan 13 21:04:22.987995 ignition[671]: Ignition finished successfully Jan 13 21:04:22.992848 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 21:04:22.993100 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 21:04:23.004644 systemd-networkd[804]: lo: Link UP Jan 13 21:04:23.004650 systemd-networkd[804]: lo: Gained carrier Jan 13 21:04:23.005325 systemd-networkd[804]: Enumeration completed Jan 13 21:04:23.005373 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 21:04:23.005515 systemd[1]: Reached target network.target - Network. Jan 13 21:04:23.005763 systemd-networkd[804]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 21:04:23.006378 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Jan 13 21:04:23.009722 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 21:04:23.009842 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 21:04:23.010032 systemd-networkd[804]: ens192: Link UP Jan 13 21:04:23.010037 systemd-networkd[804]: ens192: Gained carrier Jan 13 21:04:23.010506 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 21:04:23.018881 ignition[807]: Ignition 2.20.0 Jan 13 21:04:23.018889 ignition[807]: Stage: kargs Jan 13 21:04:23.018983 ignition[807]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.018989 ignition[807]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.019521 ignition[807]: kargs: kargs passed Jan 13 21:04:23.019549 ignition[807]: Ignition finished successfully Jan 13 21:04:23.020616 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 21:04:23.025298 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 21:04:23.032432 ignition[814]: Ignition 2.20.0 Jan 13 21:04:23.032439 ignition[814]: Stage: disks Jan 13 21:04:23.032540 ignition[814]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.032546 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.033126 ignition[814]: disks: disks passed Jan 13 21:04:23.033155 ignition[814]: Ignition finished successfully Jan 13 21:04:23.033778 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 21:04:23.034293 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 21:04:23.034528 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 21:04:23.034761 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 21:04:23.034940 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 21:04:23.035153 systemd[1]: Reached target basic.target - Basic System. 
Jan 13 21:04:23.042305 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 21:04:23.052375 systemd-fsck[822]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 21:04:23.053479 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 21:04:23.057261 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 21:04:23.111375 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none. Jan 13 21:04:23.111711 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 21:04:23.112103 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 21:04:23.124288 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 21:04:23.127008 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 21:04:23.127327 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 21:04:23.127353 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 21:04:23.127366 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 21:04:23.130142 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 21:04:23.131507 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 13 21:04:23.133195 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (830) Jan 13 21:04:23.137337 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:23.137362 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:23.137371 kernel: BTRFS info (device sda6): using free space tree Jan 13 21:04:23.143194 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 21:04:23.144129 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 21:04:23.161995 initrd-setup-root[854]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 21:04:23.182009 initrd-setup-root[861]: cut: /sysroot/etc/group: No such file or directory Jan 13 21:04:23.190878 initrd-setup-root[868]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 21:04:23.192801 initrd-setup-root[875]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 21:04:23.248860 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 21:04:23.254264 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 21:04:23.256706 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 21:04:23.261199 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:23.272317 ignition[942]: INFO : Ignition 2.20.0 Jan 13 21:04:23.272317 ignition[942]: INFO : Stage: mount Jan 13 21:04:23.272317 ignition[942]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.272317 ignition[942]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.273383 ignition[942]: INFO : mount: mount passed Jan 13 21:04:23.273383 ignition[942]: INFO : Ignition finished successfully Jan 13 21:04:23.273636 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 21:04:23.279308 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Jan 13 21:04:23.279536 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jan 13 21:04:23.826420 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jan 13 21:04:23.831316 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jan 13 21:04:23.839224 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (955)
Jan 13 21:04:23.842382 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e
Jan 13 21:04:23.842400 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jan 13 21:04:23.842409 kernel: BTRFS info (device sda6): using free space tree
Jan 13 21:04:23.848192 kernel: BTRFS info (device sda6): enabling ssd optimizations
Jan 13 21:04:23.848423 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jan 13 21:04:23.862278 ignition[972]: INFO : Ignition 2.20.0
Jan 13 21:04:23.862278 ignition[972]: INFO : Stage: files
Jan 13 21:04:23.863216 ignition[972]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 21:04:23.863216 ignition[972]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 21:04:23.863216 ignition[972]: DEBUG : files: compiled without relabeling support, skipping
Jan 13 21:04:23.863653 ignition[972]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jan 13 21:04:23.863653 ignition[972]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jan 13 21:04:23.865616 ignition[972]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jan 13 21:04:23.865755 ignition[972]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jan 13 21:04:23.865887 ignition[972]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jan 13 21:04:23.865827 unknown[972]: wrote ssh authorized keys file for user: core
Jan 13 21:04:23.867634 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1"
Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1"
Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1
Jan 13 21:04:23.906529 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK
Jan 13 21:04:24.037968 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz"
Jan 13 21:04:24.037968 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1
Jan 13 21:04:24.543332 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK
Jan 13 21:04:24.839763 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw"
Jan 13 21:04:24.840039 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 21:04:24.840039 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network"
Jan 13 21:04:24.840039 ignition[972]: INFO : files: op(d): [started] processing unit "containerd.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): op(e): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): [finished] processing unit "containerd.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): [started] processing unit "prepare-helm.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): op(10): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): op(10): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): [finished] processing unit "prepare-helm.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): [started] processing unit "coreos-metadata.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): op(12): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): op(12): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): [finished] processing unit "coreos-metadata.service"
Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(13): [started] setting preset to disabled for "coreos-metadata.service"
Jan 13 21:04:24.878315 ignition[972]: INFO : files: op(13): op(14): [started] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(13): op(14): [finished] removing enablement symlink(s) for "coreos-metadata.service"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(13): [finished] setting preset to disabled for "coreos-metadata.service"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(15): [started] setting preset to enabled for "prepare-helm.service"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(15): [finished] setting preset to enabled for "prepare-helm.service"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: createResultFile: createFiles: op(16): [started] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: createResultFile: createFiles: op(16): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jan 13 21:04:24.881125 ignition[972]: INFO : files: files passed
Jan 13 21:04:24.881125 ignition[972]: INFO : Ignition finished successfully
Jan 13 21:04:24.882765 systemd[1]: Finished ignition-files.service - Ignition (files).
Jan 13 21:04:24.887299 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jan 13 21:04:24.888885 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jan 13 21:04:24.889479 systemd[1]: ignition-quench.service: Deactivated successfully.
Jan 13 21:04:24.889667 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jan 13 21:04:24.894891 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 21:04:24.894891 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 21:04:24.895805 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jan 13 21:04:24.896487 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 21:04:24.896825 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jan 13 21:04:24.900271 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jan 13 21:04:24.916407 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 13 21:04:24.916467 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jan 13 21:04:24.916749 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jan 13 21:04:24.916870 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jan 13 21:04:24.917064 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jan 13 21:04:24.917514 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jan 13 21:04:24.926760 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 21:04:24.930308 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jan 13 21:04:24.935918 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jan 13 21:04:24.936270 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 21:04:24.936535 systemd[1]: Stopped target timers.target - Timer Units.
Jan 13 21:04:24.936836 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 13 21:04:24.936935 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jan 13 21:04:24.937593 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jan 13 21:04:24.937862 systemd[1]: Stopped target basic.target - Basic System.
Jan 13 21:04:24.938081 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jan 13 21:04:24.938372 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jan 13 21:04:24.938616 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jan 13 21:04:24.938889 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jan 13 21:04:24.939167 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 21:04:24.939438 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 21:04:24.939712 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 21:04:24.939934 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 21:04:24.940138 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 21:04:24.940333 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 21:04:24.940683 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 21:04:24.940960 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 21:04:24.941197 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 21:04:24.941359 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 21:04:24.941624 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 21:04:24.941688 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 21:04:24.942096 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 21:04:24.942163 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 21:04:24.942580 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 21:04:24.942795 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 21:04:24.942967 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 21:04:24.943282 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 21:04:24.943516 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 21:04:24.943748 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 21:04:24.943801 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 21:04:24.944142 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 21:04:24.944205 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 21:04:24.944467 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 21:04:24.944536 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 21:04:24.945051 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 21:04:24.945115 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 21:04:24.953294 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 21:04:24.955303 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 21:04:24.955405 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 21:04:24.955475 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 21:04:24.955701 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 21:04:24.955757 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 21:04:24.957583 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 21:04:24.957730 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 21:04:24.964943 ignition[1027]: INFO : Ignition 2.20.0
Jan 13 21:04:24.964943 ignition[1027]: INFO : Stage: umount
Jan 13 21:04:24.965302 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 21:04:24.965302 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 21:04:24.966140 ignition[1027]: INFO : umount: umount passed
Jan 13 21:04:24.966266 ignition[1027]: INFO : Ignition finished successfully
Jan 13 21:04:24.967779 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 21:04:24.968098 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 21:04:24.968148 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 21:04:24.968848 systemd[1]: Stopped target network.target - Network.
Jan 13 21:04:24.969071 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 21:04:24.969208 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 21:04:24.969440 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 21:04:24.969462 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 21:04:24.969691 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 21:04:24.969711 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 21:04:24.970216 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 21:04:24.970242 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 21:04:24.970417 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 21:04:24.970561 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 21:04:24.975119 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 21:04:24.975206 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 21:04:24.975500 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 21:04:24.975527 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 21:04:24.979250 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 21:04:24.979372 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 21:04:24.979402 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 21:04:24.980304 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 21:04:24.980328 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 21:04:24.980522 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 21:04:24.981034 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 21:04:24.981267 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 21:04:24.983917 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 21:04:24.983960 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 21:04:24.984894 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 21:04:24.985019 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 21:04:24.985343 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 21:04:24.985368 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 21:04:24.987415 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 21:04:24.987468 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 21:04:24.996516 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 21:04:24.996593 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 21:04:24.996887 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 21:04:24.996913 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 21:04:24.997120 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 21:04:24.997136 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 21:04:24.997447 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 21:04:24.997470 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 21:04:24.997733 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 21:04:24.997754 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 21:04:24.998033 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 21:04:24.998056 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 21:04:25.001310 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 21:04:25.001417 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 21:04:25.001442 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 21:04:25.001566 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 21:04:25.001589 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 21:04:25.001708 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 21:04:25.001729 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 21:04:25.001846 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 21:04:25.001867 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:04:25.004255 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 21:04:25.004321 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 21:04:25.016774 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 21:04:25.016833 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 21:04:25.017085 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 21:04:25.017221 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 21:04:25.017247 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 21:04:25.020292 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 21:04:25.036861 systemd[1]: Switching root.
Jan 13 21:04:25.061073 systemd-journald[217]: Journal stopped
Jan 13 21:04:20.741302 kernel: Linux version 6.6.71-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 13.3.1_p20240614 p1) 13.3.1 20240614, GNU ld (Gentoo 2.42 p6) 2.42.0) #1 SMP PREEMPT_DYNAMIC Mon Jan 13 19:01:45 -00 2025
Jan 13 21:04:20.741318 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07
Jan 13 21:04:20.741325 kernel: Disabled fast string operations
Jan 13 21:04:20.741329 kernel: BIOS-provided physical RAM map:
Jan 13 21:04:20.741333 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ebff] usable
Jan 13 21:04:20.741337 kernel: BIOS-e820: [mem 0x000000000009ec00-0x000000000009ffff] reserved
Jan 13 21:04:20.741343 kernel: BIOS-e820: [mem 0x00000000000dc000-0x00000000000fffff] reserved
Jan 13 21:04:20.741347 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007fedffff] usable
Jan 13 21:04:20.741351 kernel: BIOS-e820: [mem 0x000000007fee0000-0x000000007fefefff] ACPI data
Jan 13 21:04:20.741355 kernel: BIOS-e820: [mem 0x000000007feff000-0x000000007fefffff] ACPI NVS
Jan 13 21:04:20.741359 kernel: BIOS-e820: [mem 0x000000007ff00000-0x000000007fffffff] usable
Jan 13 21:04:20.741364 kernel: BIOS-e820: [mem 0x00000000f0000000-0x00000000f7ffffff] reserved
Jan 13 21:04:20.741368 kernel: BIOS-e820: [mem 0x00000000fec00000-0x00000000fec0ffff] reserved
Jan 13 21:04:20.741372 kernel: BIOS-e820: [mem 0x00000000fee00000-0x00000000fee00fff] reserved
Jan 13 21:04:20.741378 kernel: BIOS-e820: [mem 0x00000000fffe0000-0x00000000ffffffff] reserved
Jan 13 21:04:20.741383 kernel: NX (Execute Disable) protection: active
Jan 13 21:04:20.741388 kernel: APIC: Static calls initialized
Jan 13 21:04:20.741393 kernel: SMBIOS 2.7 present.
Jan 13 21:04:20.741400 kernel: DMI: VMware, Inc. VMware Virtual Platform/440BX Desktop Reference Platform, BIOS 6.00 05/28/2020
Jan 13 21:04:20.741407 kernel: vmware: hypercall mode: 0x00
Jan 13 21:04:20.741412 kernel: Hypervisor detected: VMware
Jan 13 21:04:20.741417 kernel: vmware: TSC freq read from hypervisor : 3408.000 MHz
Jan 13 21:04:20.741423 kernel: vmware: Host bus clock speed read from hypervisor : 66000000 Hz
Jan 13 21:04:20.741428 kernel: vmware: using clock offset of 3707271205 ns
Jan 13 21:04:20.741433 kernel: tsc: Detected 3408.000 MHz processor
Jan 13 21:04:20.741438 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 13 21:04:20.741443 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 13 21:04:20.741448 kernel: last_pfn = 0x80000 max_arch_pfn = 0x400000000
Jan 13 21:04:20.741453 kernel: total RAM covered: 3072M
Jan 13 21:04:20.741458 kernel: Found optimal setting for mtrr clean up
Jan 13 21:04:20.741463 kernel: gran_size: 64K chunk_size: 64K num_reg: 2 lose cover RAM: 0G
Jan 13 21:04:20.741469 kernel: MTRR map: 6 entries (5 fixed + 1 variable; max 21), built from 8 variable MTRRs
Jan 13 21:04:20.741474 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jan 13 21:04:20.741479 kernel: Using GB pages for direct mapping
Jan 13 21:04:20.741484 kernel: ACPI: Early table checksum verification disabled
Jan 13 21:04:20.741489 kernel: ACPI: RSDP 0x00000000000F6A00 000024 (v02 PTLTD )
Jan 13 21:04:20.741494 kernel: ACPI: XSDT 0x000000007FEE965B 00005C (v01 INTEL 440BX 06040000 VMW 01324272)
Jan 13 21:04:20.741499 kernel: ACPI: FACP 0x000000007FEFEE73 0000F4 (v04 INTEL 440BX 06040000 PTL 000F4240)
Jan 13 21:04:20.741503 kernel: ACPI: DSDT 0x000000007FEEAD55 01411E (v01 PTLTD Custom 06040000 MSFT 03000001)
Jan 13 21:04:20.741508 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 21:04:20.741516 kernel: ACPI: FACS 0x000000007FEFFFC0 000040
Jan 13 21:04:20.741521 kernel: ACPI: BOOT 0x000000007FEEAD2D 000028 (v01 PTLTD $SBFTBL$ 06040000 LTP 00000001)
Jan 13 21:04:20.741526 kernel: ACPI: APIC 0x000000007FEEA5EB 000742 (v01 PTLTD ? APIC 06040000 LTP 00000000)
Jan 13 21:04:20.741532 kernel: ACPI: MCFG 0x000000007FEEA5AF 00003C (v01 PTLTD $PCITBL$ 06040000 LTP 00000001)
Jan 13 21:04:20.741537 kernel: ACPI: SRAT 0x000000007FEE9757 0008A8 (v02 VMWARE MEMPLUG 06040000 VMW 00000001)
Jan 13 21:04:20.741543 kernel: ACPI: HPET 0x000000007FEE971F 000038 (v01 VMWARE VMW HPET 06040000 VMW 00000001)
Jan 13 21:04:20.741548 kernel: ACPI: WAET 0x000000007FEE96F7 000028 (v01 VMWARE VMW WAET 06040000 VMW 00000001)
Jan 13 21:04:20.741553 kernel: ACPI: Reserving FACP table memory at [mem 0x7fefee73-0x7fefef66]
Jan 13 21:04:20.741558 kernel: ACPI: Reserving DSDT table memory at [mem 0x7feead55-0x7fefee72]
Jan 13 21:04:20.741566 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 21:04:20.741571 kernel: ACPI: Reserving FACS table memory at [mem 0x7fefffc0-0x7fefffff]
Jan 13 21:04:20.741576 kernel: ACPI: Reserving BOOT table memory at [mem 0x7feead2d-0x7feead54]
Jan 13 21:04:20.741581 kernel: ACPI: Reserving APIC table memory at [mem 0x7feea5eb-0x7feead2c]
Jan 13 21:04:20.741586 kernel: ACPI: Reserving MCFG table memory at [mem 0x7feea5af-0x7feea5ea]
Jan 13 21:04:20.741591 kernel: ACPI: Reserving SRAT table memory at [mem 0x7fee9757-0x7fee9ffe]
Jan 13 21:04:20.741598 kernel: ACPI: Reserving HPET table memory at [mem 0x7fee971f-0x7fee9756]
Jan 13 21:04:20.741606 kernel: ACPI: Reserving WAET table memory at [mem 0x7fee96f7-0x7fee971e]
Jan 13 21:04:20.741612 kernel: system APIC only can use physical flat
Jan 13 21:04:20.741618 kernel: APIC: Switched APIC routing to: physical flat
Jan 13 21:04:20.741623 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0
Jan 13 21:04:20.741628 kernel: SRAT: PXM 0 -> APIC 0x02 -> Node 0
Jan 13 21:04:20.741633 kernel: SRAT: PXM 0 -> APIC 0x04 -> Node 0
Jan 13 21:04:20.741638 kernel: SRAT: PXM 0 -> APIC 0x06 -> Node 0
Jan 13 21:04:20.741642 kernel: SRAT: PXM 0 -> APIC 0x08 -> Node 0
Jan 13 21:04:20.741650 kernel: SRAT: PXM 0 -> APIC 0x0a -> Node 0
Jan 13 21:04:20.741655 kernel: SRAT: PXM 0 -> APIC 0x0c -> Node 0
Jan 13 21:04:20.741660 kernel: SRAT: PXM 0 -> APIC 0x0e -> Node 0
Jan 13 21:04:20.741665 kernel: SRAT: PXM 0 -> APIC 0x10 -> Node 0
Jan 13 21:04:20.741669 kernel: SRAT: PXM 0 -> APIC 0x12 -> Node 0
Jan 13 21:04:20.741674 kernel: SRAT: PXM 0 -> APIC 0x14 -> Node 0
Jan 13 21:04:20.741679 kernel: SRAT: PXM 0 -> APIC 0x16 -> Node 0
Jan 13 21:04:20.741684 kernel: SRAT: PXM 0 -> APIC 0x18 -> Node 0
Jan 13 21:04:20.741689 kernel: SRAT: PXM 0 -> APIC 0x1a -> Node 0
Jan 13 21:04:20.741694 kernel: SRAT: PXM 0 -> APIC 0x1c -> Node 0
Jan 13 21:04:20.741701 kernel: SRAT: PXM 0 -> APIC 0x1e -> Node 0
Jan 13 21:04:20.741708 kernel: SRAT: PXM 0 -> APIC 0x20 -> Node 0
Jan 13 21:04:20.741716 kernel: SRAT: PXM 0 -> APIC 0x22 -> Node 0
Jan 13 21:04:20.741724 kernel: SRAT: PXM 0 -> APIC 0x24 -> Node 0
Jan 13 21:04:20.741729 kernel: SRAT: PXM 0 -> APIC 0x26 -> Node 0
Jan 13 21:04:20.741734 kernel: SRAT: PXM 0 -> APIC 0x28 -> Node 0
Jan 13 21:04:20.741739 kernel: SRAT: PXM 0 -> APIC 0x2a -> Node 0
Jan 13 21:04:20.741744 kernel: SRAT: PXM 0 -> APIC 0x2c -> Node 0
Jan 13 21:04:20.741749 kernel: SRAT: PXM 0 -> APIC 0x2e -> Node 0
Jan 13 21:04:20.741755 kernel: SRAT: PXM 0 -> APIC 0x30 -> Node 0
Jan 13 21:04:20.741763 kernel: SRAT: PXM 0 -> APIC 0x32 -> Node 0
Jan 13 21:04:20.741770 kernel: SRAT: PXM 0 -> APIC 0x34 -> Node 0
Jan 13 21:04:20.741775 kernel: SRAT: PXM 0 -> APIC 0x36 -> Node 0
Jan 13 21:04:20.741780 kernel: SRAT: PXM 0 -> APIC 0x38 -> Node 0
Jan 13 21:04:20.741785 kernel: SRAT: PXM 0 -> APIC 0x3a -> Node 0
Jan 13 21:04:20.741790 kernel: SRAT: PXM 0 -> APIC 0x3c -> Node 0
Jan 13 21:04:20.741795 kernel: SRAT: PXM 0 -> APIC 0x3e -> Node 0
Jan 13 21:04:20.741800 kernel: SRAT: PXM 0 -> APIC 0x40 -> Node 0
Jan 13 21:04:20.741805 kernel: SRAT: PXM 0 -> APIC 0x42 -> Node 0
Jan 13 21:04:20.741810 kernel: SRAT: PXM 0 -> APIC 0x44 -> Node 0
Jan 13 21:04:20.741814 kernel: SRAT: PXM 0 -> APIC 0x46 -> Node 0
Jan 13 21:04:20.741821 kernel: SRAT: PXM 0 -> APIC 0x48 -> Node 0
Jan 13 21:04:20.741826 kernel: SRAT: PXM 0 -> APIC 0x4a -> Node 0
Jan 13 21:04:20.741831 kernel: SRAT: PXM 0 -> APIC 0x4c -> Node 0
Jan 13 21:04:20.741835 kernel: SRAT: PXM 0 -> APIC 0x4e -> Node 0
Jan 13 21:04:20.741840 kernel: SRAT: PXM 0 -> APIC 0x50 -> Node 0
Jan 13 21:04:20.741845 kernel: SRAT: PXM 0 -> APIC 0x52 -> Node 0
Jan 13 21:04:20.741850 kernel: SRAT: PXM 0 -> APIC 0x54 -> Node 0
Jan 13 21:04:20.741855 kernel: SRAT: PXM 0 -> APIC 0x56 -> Node 0
Jan 13 21:04:20.741860 kernel: SRAT: PXM 0 -> APIC 0x58 -> Node 0
Jan 13 21:04:20.741865 kernel: SRAT: PXM 0 -> APIC 0x5a -> Node 0
Jan 13 21:04:20.741871 kernel: SRAT: PXM 0 -> APIC 0x5c -> Node 0
Jan 13 21:04:20.741877 kernel: SRAT: PXM 0 -> APIC 0x5e -> Node 0
Jan 13 21:04:20.741881 kernel: SRAT: PXM 0 -> APIC 0x60 -> Node 0
Jan 13 21:04:20.741886 kernel: SRAT: PXM 0 -> APIC 0x62 -> Node 0
Jan 13 21:04:20.741891 kernel: SRAT: PXM 0 -> APIC 0x64 -> Node 0
Jan 13 21:04:20.741896 kernel: SRAT: PXM 0 -> APIC 0x66 -> Node 0
Jan 13 21:04:20.741901 kernel: SRAT: PXM 0 -> APIC 0x68 -> Node 0
Jan 13 21:04:20.741906 kernel: SRAT: PXM 0 -> APIC 0x6a -> Node 0
Jan 13 21:04:20.741911 kernel: SRAT: PXM 0 -> APIC 0x6c -> Node 0
Jan 13 21:04:20.741916 kernel: SRAT: PXM 0 -> APIC 0x6e -> Node 0
Jan 13 21:04:20.741922 kernel: SRAT: PXM 0 -> APIC 0x70 -> Node 0
Jan 13 21:04:20.741927 kernel: SRAT: PXM 0 -> APIC 0x72 -> Node 0
Jan 13 21:04:20.741932 kernel: SRAT: PXM 0 -> APIC 0x74 -> Node 0
Jan 13 21:04:20.741941 kernel: SRAT: PXM 0 -> APIC 0x76 -> Node 0
Jan 13 21:04:20.741948 kernel: SRAT: PXM 0 -> APIC 0x78 -> Node 0
Jan 13 21:04:20.741953 kernel: SRAT: PXM 0 -> APIC 0x7a -> Node 0
Jan 13 21:04:20.741958 kernel: SRAT: PXM 0 -> APIC 0x7c -> Node 0
Jan 13 21:04:20.741964 kernel: SRAT: PXM 0 -> APIC 0x7e -> Node 0
Jan 13 21:04:20.741970 kernel: SRAT: PXM 0 -> APIC 0x80 -> Node 0
Jan 13 21:04:20.741975 kernel: SRAT: PXM 0 -> APIC 0x82 -> Node 0
Jan 13 21:04:20.741981 kernel: SRAT: PXM 0 -> APIC 0x84 -> Node 0
Jan 13 21:04:20.741986 kernel: SRAT: PXM 0 -> APIC 0x86 -> Node 0
Jan 13 21:04:20.741992 kernel: SRAT: PXM 0 -> APIC 0x88 -> Node 0
Jan 13 21:04:20.741997 kernel: SRAT: PXM 0 -> APIC 0x8a -> Node 0
Jan 13 21:04:20.742002 kernel: SRAT: PXM 0 -> APIC 0x8c -> Node 0
Jan 13 21:04:20.742007 kernel: SRAT: PXM 0 -> APIC 0x8e -> Node 0
Jan 13 21:04:20.742013 kernel: SRAT: PXM 0 -> APIC 0x90 -> Node 0
Jan 13 21:04:20.742018 kernel: SRAT: PXM 0 -> APIC 0x92 -> Node 0
Jan 13 21:04:20.742023 kernel: SRAT: PXM 0 -> APIC 0x94 -> Node 0
Jan 13 21:04:20.742030 kernel: SRAT: PXM 0 -> APIC 0x96 -> Node 0
Jan 13 21:04:20.742035 kernel: SRAT: PXM 0 -> APIC 0x98 -> Node 0
Jan 13 21:04:20.742040 kernel: SRAT: PXM 0 -> APIC 0x9a -> Node 0
Jan 13 21:04:20.742046 kernel: SRAT: PXM 0 -> APIC 0x9c -> Node 0
Jan 13 21:04:20.742051 kernel: SRAT: PXM 0 -> APIC 0x9e -> Node 0
Jan 13 21:04:20.742056 kernel: SRAT: PXM 0 -> APIC 0xa0 -> Node 0
Jan 13 21:04:20.742061 kernel: SRAT: PXM 0 -> APIC 0xa2 -> Node 0
Jan 13 21:04:20.742067 kernel: SRAT: PXM 0 -> APIC 0xa4 -> Node 0
Jan 13 21:04:20.742072 kernel: SRAT: PXM 0 -> APIC 0xa6 -> Node 0
Jan 13 21:04:20.742078 kernel: SRAT: PXM 0 -> APIC 0xa8 -> Node 0
Jan 13 21:04:20.742084 kernel: SRAT: PXM 0 -> APIC 0xaa -> Node 0
Jan 13 21:04:20.742089 kernel: SRAT: PXM 0 -> APIC 0xac -> Node 0
Jan 13 21:04:20.742094 kernel: SRAT: PXM 0 -> APIC 0xae -> Node 0
Jan 13 21:04:20.742100 kernel: SRAT: PXM 0 -> APIC 0xb0 -> Node 0
Jan 13 21:04:20.742105 kernel: SRAT: PXM 0 -> APIC 0xb2 -> Node 0
Jan 13 21:04:20.742110 kernel: SRAT: PXM 0 -> APIC 0xb4 -> Node 0
Jan 13 21:04:20.742116 kernel: SRAT: PXM 0 -> APIC 0xb6 -> Node 0
Jan 13 21:04:20.742124 kernel: SRAT: PXM 0 -> APIC 0xb8 -> Node 0
Jan 13 21:04:20.742133 kernel: SRAT: PXM 0 -> APIC 0xba -> Node 0
Jan 13 21:04:20.742140 kernel: SRAT: PXM 0 -> APIC 0xbc -> Node 0
Jan 13 21:04:20.742147 kernel: SRAT: PXM 0 -> APIC 0xbe -> Node 0
Jan 13 21:04:20.742152 kernel: SRAT: PXM 0 -> APIC 0xc0 -> Node 0
Jan 13 21:04:20.742158 kernel: SRAT: PXM 0 -> APIC 0xc2 -> Node 0
Jan 13 21:04:20.742163 kernel: SRAT: PXM 0 -> APIC 0xc4 -> Node 0
Jan 13 21:04:20.742168 kernel: SRAT: PXM 0 -> APIC 0xc6 -> Node 0
Jan 13 21:04:20.742174 kernel: SRAT: PXM 0 -> APIC 0xc8 -> Node 0
Jan 13 21:04:20.742179 kernel: SRAT: PXM 0 -> APIC 0xca -> Node 0
Jan 13 21:04:20.742540 kernel: SRAT: PXM 0 -> APIC 0xcc -> Node 0
Jan 13 21:04:20.742546 kernel: SRAT: PXM 0 -> APIC 0xce -> Node 0
Jan 13 21:04:20.742552 kernel: SRAT: PXM 0 -> APIC 0xd0 -> Node 0
Jan 13 21:04:20.742560 kernel: SRAT: PXM 0 -> APIC 0xd2 -> Node 0
Jan 13 21:04:20.742565 kernel: SRAT: PXM 0 -> APIC 0xd4 -> Node 0
Jan 13 21:04:20.742571 kernel: SRAT: PXM 0 -> APIC 0xd6 -> Node 0
Jan 13 21:04:20.742576 kernel: SRAT: PXM 0 -> APIC 0xd8 -> Node 0
Jan 13 21:04:20.742581 kernel: SRAT: PXM 0 -> APIC 0xda -> Node 0
Jan 13 21:04:20.742586 kernel: SRAT: PXM 0 -> APIC 0xdc -> Node 0
Jan 13 21:04:20.742592 kernel: SRAT: PXM 0 -> APIC 0xde -> Node 0
Jan 13 21:04:20.742597 kernel: SRAT: PXM 0 -> APIC 0xe0 -> Node 0
Jan 13 21:04:20.742602 kernel: SRAT: PXM 0 -> APIC 0xe2 -> Node 0
Jan 13 21:04:20.742607 kernel: SRAT: PXM 0 -> APIC 0xe4 -> Node 0
Jan 13 21:04:20.742614 kernel: SRAT: PXM 0 -> APIC 0xe6 -> Node 0
Jan 13 21:04:20.742619 kernel: SRAT: PXM 0 -> APIC 0xe8 -> Node 0
Jan 13 21:04:20.742624 kernel: SRAT: PXM 0 -> APIC 0xea -> Node 0
Jan 13 21:04:20.742630 kernel: SRAT: PXM 0 -> APIC 0xec -> Node 0
Jan 13 21:04:20.742635 kernel: SRAT: PXM 0 -> APIC 0xee -> Node 0
Jan 13 21:04:20.742641 kernel: SRAT: PXM 0 -> APIC 0xf0 -> Node 0
Jan 13 21:04:20.742646 kernel: SRAT: PXM 0 -> APIC 0xf2 -> Node 0
Jan 13 21:04:20.742651 kernel: SRAT: PXM 0 -> APIC 0xf4 -> Node 0
Jan 13 21:04:20.742656 kernel: SRAT: PXM 0 -> APIC 0xf6 -> Node 0
Jan 13 21:04:20.742662 kernel: SRAT: PXM 0 -> APIC 0xf8 -> Node 0
Jan 13 21:04:20.742668 kernel: SRAT: PXM 0 -> APIC 0xfa -> Node 0
Jan 13 21:04:20.742673 kernel: SRAT: PXM 0 -> APIC 0xfc -> Node 0
Jan 13 21:04:20.742679 kernel: SRAT: PXM 0 -> APIC 0xfe -> Node 0
Jan 13 21:04:20.742684 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x0009ffff]
Jan 13 21:04:20.742690 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00100000-0x7fffffff]
Jan 13 21:04:20.742696 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000-0xbfffffff] hotplug
Jan 13 21:04:20.742704 kernel: NUMA: Node 0 [mem 0x00000000-0x0009ffff] + [mem 0x00100000-0x7fffffff] -> [mem 0x00000000-0x7fffffff]
Jan 13 21:04:20.742710 kernel: NODE_DATA(0) allocated [mem 0x7fffa000-0x7fffffff]
Jan 13 21:04:20.742716 kernel: Zone ranges:
Jan 13 21:04:20.742721 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jan 13 21:04:20.742728 kernel: DMA32 [mem 0x0000000001000000-0x000000007fffffff]
Jan 13 21:04:20.742734 kernel: Normal empty
Jan 13 21:04:20.742739 kernel: Movable zone start for each node
Jan 13 21:04:20.742745 kernel: Early memory node ranges
Jan 13 21:04:20.742750 kernel: node 0: [mem 0x0000000000001000-0x000000000009dfff]
Jan 13 21:04:20.742756 kernel: node 0: [mem 0x0000000000100000-0x000000007fedffff]
Jan 13 21:04:20.742761 kernel: node 0: [mem 0x000000007ff00000-0x000000007fffffff]
Jan 13 21:04:20.742766 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007fffffff]
Jan 13 21:04:20.742772 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 13 21:04:20.742778 kernel: On node 0, zone DMA: 98 pages in unavailable ranges
Jan 13 21:04:20.742784 kernel: On node 0, zone DMA32: 32 pages in unavailable ranges
Jan 13 21:04:20.742789 kernel: ACPI: PM-Timer IO Port: 0x1008
Jan 13 21:04:20.742795 kernel: system APIC only can use physical flat
Jan 13 21:04:20.742800 kernel: ACPI: LAPIC_NMI (acpi_id[0x00] high edge lint[0x1])
Jan 13 21:04:20.742806 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] high edge lint[0x1])
Jan 13 
21:04:20.742814 kernel: ACPI: LAPIC_NMI (acpi_id[0x02] high edge lint[0x1]) Jan 13 21:04:20.742819 kernel: ACPI: LAPIC_NMI (acpi_id[0x03] high edge lint[0x1]) Jan 13 21:04:20.742825 kernel: ACPI: LAPIC_NMI (acpi_id[0x04] high edge lint[0x1]) Jan 13 21:04:20.742830 kernel: ACPI: LAPIC_NMI (acpi_id[0x05] high edge lint[0x1]) Jan 13 21:04:20.742837 kernel: ACPI: LAPIC_NMI (acpi_id[0x06] high edge lint[0x1]) Jan 13 21:04:20.742842 kernel: ACPI: LAPIC_NMI (acpi_id[0x07] high edge lint[0x1]) Jan 13 21:04:20.742848 kernel: ACPI: LAPIC_NMI (acpi_id[0x08] high edge lint[0x1]) Jan 13 21:04:20.742853 kernel: ACPI: LAPIC_NMI (acpi_id[0x09] high edge lint[0x1]) Jan 13 21:04:20.742862 kernel: ACPI: LAPIC_NMI (acpi_id[0x0a] high edge lint[0x1]) Jan 13 21:04:20.742868 kernel: ACPI: LAPIC_NMI (acpi_id[0x0b] high edge lint[0x1]) Jan 13 21:04:20.742874 kernel: ACPI: LAPIC_NMI (acpi_id[0x0c] high edge lint[0x1]) Jan 13 21:04:20.742879 kernel: ACPI: LAPIC_NMI (acpi_id[0x0d] high edge lint[0x1]) Jan 13 21:04:20.742884 kernel: ACPI: LAPIC_NMI (acpi_id[0x0e] high edge lint[0x1]) Jan 13 21:04:20.742891 kernel: ACPI: LAPIC_NMI (acpi_id[0x0f] high edge lint[0x1]) Jan 13 21:04:20.742897 kernel: ACPI: LAPIC_NMI (acpi_id[0x10] high edge lint[0x1]) Jan 13 21:04:20.742902 kernel: ACPI: LAPIC_NMI (acpi_id[0x11] high edge lint[0x1]) Jan 13 21:04:20.742907 kernel: ACPI: LAPIC_NMI (acpi_id[0x12] high edge lint[0x1]) Jan 13 21:04:20.742913 kernel: ACPI: LAPIC_NMI (acpi_id[0x13] high edge lint[0x1]) Jan 13 21:04:20.742918 kernel: ACPI: LAPIC_NMI (acpi_id[0x14] high edge lint[0x1]) Jan 13 21:04:20.742923 kernel: ACPI: LAPIC_NMI (acpi_id[0x15] high edge lint[0x1]) Jan 13 21:04:20.742929 kernel: ACPI: LAPIC_NMI (acpi_id[0x16] high edge lint[0x1]) Jan 13 21:04:20.742934 kernel: ACPI: LAPIC_NMI (acpi_id[0x17] high edge lint[0x1]) Jan 13 21:04:20.742939 kernel: ACPI: LAPIC_NMI (acpi_id[0x18] high edge lint[0x1]) Jan 13 21:04:20.742946 kernel: ACPI: LAPIC_NMI (acpi_id[0x19] high edge lint[0x1]) Jan 13 
21:04:20.742952 kernel: ACPI: LAPIC_NMI (acpi_id[0x1a] high edge lint[0x1]) Jan 13 21:04:20.742957 kernel: ACPI: LAPIC_NMI (acpi_id[0x1b] high edge lint[0x1]) Jan 13 21:04:20.742962 kernel: ACPI: LAPIC_NMI (acpi_id[0x1c] high edge lint[0x1]) Jan 13 21:04:20.742968 kernel: ACPI: LAPIC_NMI (acpi_id[0x1d] high edge lint[0x1]) Jan 13 21:04:20.742973 kernel: ACPI: LAPIC_NMI (acpi_id[0x1e] high edge lint[0x1]) Jan 13 21:04:20.742978 kernel: ACPI: LAPIC_NMI (acpi_id[0x1f] high edge lint[0x1]) Jan 13 21:04:20.742983 kernel: ACPI: LAPIC_NMI (acpi_id[0x20] high edge lint[0x1]) Jan 13 21:04:20.742989 kernel: ACPI: LAPIC_NMI (acpi_id[0x21] high edge lint[0x1]) Jan 13 21:04:20.742994 kernel: ACPI: LAPIC_NMI (acpi_id[0x22] high edge lint[0x1]) Jan 13 21:04:20.743001 kernel: ACPI: LAPIC_NMI (acpi_id[0x23] high edge lint[0x1]) Jan 13 21:04:20.743006 kernel: ACPI: LAPIC_NMI (acpi_id[0x24] high edge lint[0x1]) Jan 13 21:04:20.743012 kernel: ACPI: LAPIC_NMI (acpi_id[0x25] high edge lint[0x1]) Jan 13 21:04:20.743017 kernel: ACPI: LAPIC_NMI (acpi_id[0x26] high edge lint[0x1]) Jan 13 21:04:20.743022 kernel: ACPI: LAPIC_NMI (acpi_id[0x27] high edge lint[0x1]) Jan 13 21:04:20.743028 kernel: ACPI: LAPIC_NMI (acpi_id[0x28] high edge lint[0x1]) Jan 13 21:04:20.743033 kernel: ACPI: LAPIC_NMI (acpi_id[0x29] high edge lint[0x1]) Jan 13 21:04:20.743038 kernel: ACPI: LAPIC_NMI (acpi_id[0x2a] high edge lint[0x1]) Jan 13 21:04:20.743043 kernel: ACPI: LAPIC_NMI (acpi_id[0x2b] high edge lint[0x1]) Jan 13 21:04:20.743050 kernel: ACPI: LAPIC_NMI (acpi_id[0x2c] high edge lint[0x1]) Jan 13 21:04:20.743055 kernel: ACPI: LAPIC_NMI (acpi_id[0x2d] high edge lint[0x1]) Jan 13 21:04:20.743060 kernel: ACPI: LAPIC_NMI (acpi_id[0x2e] high edge lint[0x1]) Jan 13 21:04:20.743066 kernel: ACPI: LAPIC_NMI (acpi_id[0x2f] high edge lint[0x1]) Jan 13 21:04:20.743071 kernel: ACPI: LAPIC_NMI (acpi_id[0x30] high edge lint[0x1]) Jan 13 21:04:20.743076 kernel: ACPI: LAPIC_NMI (acpi_id[0x31] high edge lint[0x1]) Jan 13 
21:04:20.743082 kernel: ACPI: LAPIC_NMI (acpi_id[0x32] high edge lint[0x1]) Jan 13 21:04:20.743087 kernel: ACPI: LAPIC_NMI (acpi_id[0x33] high edge lint[0x1]) Jan 13 21:04:20.743092 kernel: ACPI: LAPIC_NMI (acpi_id[0x34] high edge lint[0x1]) Jan 13 21:04:20.743098 kernel: ACPI: LAPIC_NMI (acpi_id[0x35] high edge lint[0x1]) Jan 13 21:04:20.743104 kernel: ACPI: LAPIC_NMI (acpi_id[0x36] high edge lint[0x1]) Jan 13 21:04:20.743109 kernel: ACPI: LAPIC_NMI (acpi_id[0x37] high edge lint[0x1]) Jan 13 21:04:20.743115 kernel: ACPI: LAPIC_NMI (acpi_id[0x38] high edge lint[0x1]) Jan 13 21:04:20.743120 kernel: ACPI: LAPIC_NMI (acpi_id[0x39] high edge lint[0x1]) Jan 13 21:04:20.743126 kernel: ACPI: LAPIC_NMI (acpi_id[0x3a] high edge lint[0x1]) Jan 13 21:04:20.743131 kernel: ACPI: LAPIC_NMI (acpi_id[0x3b] high edge lint[0x1]) Jan 13 21:04:20.743136 kernel: ACPI: LAPIC_NMI (acpi_id[0x3c] high edge lint[0x1]) Jan 13 21:04:20.743141 kernel: ACPI: LAPIC_NMI (acpi_id[0x3d] high edge lint[0x1]) Jan 13 21:04:20.743147 kernel: ACPI: LAPIC_NMI (acpi_id[0x3e] high edge lint[0x1]) Jan 13 21:04:20.743152 kernel: ACPI: LAPIC_NMI (acpi_id[0x3f] high edge lint[0x1]) Jan 13 21:04:20.743158 kernel: ACPI: LAPIC_NMI (acpi_id[0x40] high edge lint[0x1]) Jan 13 21:04:20.743164 kernel: ACPI: LAPIC_NMI (acpi_id[0x41] high edge lint[0x1]) Jan 13 21:04:20.743169 kernel: ACPI: LAPIC_NMI (acpi_id[0x42] high edge lint[0x1]) Jan 13 21:04:20.743175 kernel: ACPI: LAPIC_NMI (acpi_id[0x43] high edge lint[0x1]) Jan 13 21:04:20.743180 kernel: ACPI: LAPIC_NMI (acpi_id[0x44] high edge lint[0x1]) Jan 13 21:04:20.743192 kernel: ACPI: LAPIC_NMI (acpi_id[0x45] high edge lint[0x1]) Jan 13 21:04:20.743198 kernel: ACPI: LAPIC_NMI (acpi_id[0x46] high edge lint[0x1]) Jan 13 21:04:20.743203 kernel: ACPI: LAPIC_NMI (acpi_id[0x47] high edge lint[0x1]) Jan 13 21:04:20.743208 kernel: ACPI: LAPIC_NMI (acpi_id[0x48] high edge lint[0x1]) Jan 13 21:04:20.743215 kernel: ACPI: LAPIC_NMI (acpi_id[0x49] high edge lint[0x1]) Jan 13 
21:04:20.743225 kernel: ACPI: LAPIC_NMI (acpi_id[0x4a] high edge lint[0x1]) Jan 13 21:04:20.743230 kernel: ACPI: LAPIC_NMI (acpi_id[0x4b] high edge lint[0x1]) Jan 13 21:04:20.743238 kernel: ACPI: LAPIC_NMI (acpi_id[0x4c] high edge lint[0x1]) Jan 13 21:04:20.743244 kernel: ACPI: LAPIC_NMI (acpi_id[0x4d] high edge lint[0x1]) Jan 13 21:04:20.743249 kernel: ACPI: LAPIC_NMI (acpi_id[0x4e] high edge lint[0x1]) Jan 13 21:04:20.743255 kernel: ACPI: LAPIC_NMI (acpi_id[0x4f] high edge lint[0x1]) Jan 13 21:04:20.743260 kernel: ACPI: LAPIC_NMI (acpi_id[0x50] high edge lint[0x1]) Jan 13 21:04:20.743266 kernel: ACPI: LAPIC_NMI (acpi_id[0x51] high edge lint[0x1]) Jan 13 21:04:20.743271 kernel: ACPI: LAPIC_NMI (acpi_id[0x52] high edge lint[0x1]) Jan 13 21:04:20.743278 kernel: ACPI: LAPIC_NMI (acpi_id[0x53] high edge lint[0x1]) Jan 13 21:04:20.743283 kernel: ACPI: LAPIC_NMI (acpi_id[0x54] high edge lint[0x1]) Jan 13 21:04:20.743289 kernel: ACPI: LAPIC_NMI (acpi_id[0x55] high edge lint[0x1]) Jan 13 21:04:20.743294 kernel: ACPI: LAPIC_NMI (acpi_id[0x56] high edge lint[0x1]) Jan 13 21:04:20.743300 kernel: ACPI: LAPIC_NMI (acpi_id[0x57] high edge lint[0x1]) Jan 13 21:04:20.743305 kernel: ACPI: LAPIC_NMI (acpi_id[0x58] high edge lint[0x1]) Jan 13 21:04:20.743310 kernel: ACPI: LAPIC_NMI (acpi_id[0x59] high edge lint[0x1]) Jan 13 21:04:20.743315 kernel: ACPI: LAPIC_NMI (acpi_id[0x5a] high edge lint[0x1]) Jan 13 21:04:20.743321 kernel: ACPI: LAPIC_NMI (acpi_id[0x5b] high edge lint[0x1]) Jan 13 21:04:20.743327 kernel: ACPI: LAPIC_NMI (acpi_id[0x5c] high edge lint[0x1]) Jan 13 21:04:20.743333 kernel: ACPI: LAPIC_NMI (acpi_id[0x5d] high edge lint[0x1]) Jan 13 21:04:20.743338 kernel: ACPI: LAPIC_NMI (acpi_id[0x5e] high edge lint[0x1]) Jan 13 21:04:20.743343 kernel: ACPI: LAPIC_NMI (acpi_id[0x5f] high edge lint[0x1]) Jan 13 21:04:20.743349 kernel: ACPI: LAPIC_NMI (acpi_id[0x60] high edge lint[0x1]) Jan 13 21:04:20.743354 kernel: ACPI: LAPIC_NMI (acpi_id[0x61] high edge lint[0x1]) Jan 13 
21:04:20.743359 kernel: ACPI: LAPIC_NMI (acpi_id[0x62] high edge lint[0x1]) Jan 13 21:04:20.743365 kernel: ACPI: LAPIC_NMI (acpi_id[0x63] high edge lint[0x1]) Jan 13 21:04:20.743370 kernel: ACPI: LAPIC_NMI (acpi_id[0x64] high edge lint[0x1]) Jan 13 21:04:20.743375 kernel: ACPI: LAPIC_NMI (acpi_id[0x65] high edge lint[0x1]) Jan 13 21:04:20.743382 kernel: ACPI: LAPIC_NMI (acpi_id[0x66] high edge lint[0x1]) Jan 13 21:04:20.743388 kernel: ACPI: LAPIC_NMI (acpi_id[0x67] high edge lint[0x1]) Jan 13 21:04:20.743395 kernel: ACPI: LAPIC_NMI (acpi_id[0x68] high edge lint[0x1]) Jan 13 21:04:20.743400 kernel: ACPI: LAPIC_NMI (acpi_id[0x69] high edge lint[0x1]) Jan 13 21:04:20.743409 kernel: ACPI: LAPIC_NMI (acpi_id[0x6a] high edge lint[0x1]) Jan 13 21:04:20.743418 kernel: ACPI: LAPIC_NMI (acpi_id[0x6b] high edge lint[0x1]) Jan 13 21:04:20.743425 kernel: ACPI: LAPIC_NMI (acpi_id[0x6c] high edge lint[0x1]) Jan 13 21:04:20.743430 kernel: ACPI: LAPIC_NMI (acpi_id[0x6d] high edge lint[0x1]) Jan 13 21:04:20.743436 kernel: ACPI: LAPIC_NMI (acpi_id[0x6e] high edge lint[0x1]) Jan 13 21:04:20.743441 kernel: ACPI: LAPIC_NMI (acpi_id[0x6f] high edge lint[0x1]) Jan 13 21:04:20.743448 kernel: ACPI: LAPIC_NMI (acpi_id[0x70] high edge lint[0x1]) Jan 13 21:04:20.743454 kernel: ACPI: LAPIC_NMI (acpi_id[0x71] high edge lint[0x1]) Jan 13 21:04:20.743459 kernel: ACPI: LAPIC_NMI (acpi_id[0x72] high edge lint[0x1]) Jan 13 21:04:20.743464 kernel: ACPI: LAPIC_NMI (acpi_id[0x73] high edge lint[0x1]) Jan 13 21:04:20.743470 kernel: ACPI: LAPIC_NMI (acpi_id[0x74] high edge lint[0x1]) Jan 13 21:04:20.743475 kernel: ACPI: LAPIC_NMI (acpi_id[0x75] high edge lint[0x1]) Jan 13 21:04:20.743480 kernel: ACPI: LAPIC_NMI (acpi_id[0x76] high edge lint[0x1]) Jan 13 21:04:20.743486 kernel: ACPI: LAPIC_NMI (acpi_id[0x77] high edge lint[0x1]) Jan 13 21:04:20.743491 kernel: ACPI: LAPIC_NMI (acpi_id[0x78] high edge lint[0x1]) Jan 13 21:04:20.743497 kernel: ACPI: LAPIC_NMI (acpi_id[0x79] high edge lint[0x1]) Jan 13 
21:04:20.743503 kernel: ACPI: LAPIC_NMI (acpi_id[0x7a] high edge lint[0x1]) Jan 13 21:04:20.743508 kernel: ACPI: LAPIC_NMI (acpi_id[0x7b] high edge lint[0x1]) Jan 13 21:04:20.743513 kernel: ACPI: LAPIC_NMI (acpi_id[0x7c] high edge lint[0x1]) Jan 13 21:04:20.743519 kernel: ACPI: LAPIC_NMI (acpi_id[0x7d] high edge lint[0x1]) Jan 13 21:04:20.743524 kernel: ACPI: LAPIC_NMI (acpi_id[0x7e] high edge lint[0x1]) Jan 13 21:04:20.743529 kernel: ACPI: LAPIC_NMI (acpi_id[0x7f] high edge lint[0x1]) Jan 13 21:04:20.743534 kernel: IOAPIC[0]: apic_id 1, version 17, address 0xfec00000, GSI 0-23 Jan 13 21:04:20.743540 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 high edge) Jan 13 21:04:20.743545 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Jan 13 21:04:20.743552 kernel: ACPI: HPET id: 0x8086af01 base: 0xfed00000 Jan 13 21:04:20.743558 kernel: TSC deadline timer available Jan 13 21:04:20.743563 kernel: smpboot: Allowing 128 CPUs, 126 hotplug CPUs Jan 13 21:04:20.743569 kernel: [mem 0x80000000-0xefffffff] available for PCI devices Jan 13 21:04:20.743574 kernel: Booting paravirtualized kernel on VMware hypervisor Jan 13 21:04:20.743580 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Jan 13 21:04:20.743585 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:128 nr_cpu_ids:128 nr_node_ids:1 Jan 13 21:04:20.743591 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u262144 Jan 13 21:04:20.743596 kernel: pcpu-alloc: s197032 r8192 d32344 u262144 alloc=1*2097152 Jan 13 21:04:20.743603 kernel: pcpu-alloc: [0] 000 001 002 003 004 005 006 007 Jan 13 21:04:20.743608 kernel: pcpu-alloc: [0] 008 009 010 011 012 013 014 015 Jan 13 21:04:20.743614 kernel: pcpu-alloc: [0] 016 017 018 019 020 021 022 023 Jan 13 21:04:20.743619 kernel: pcpu-alloc: [0] 024 025 026 027 028 029 030 031 Jan 13 21:04:20.743624 kernel: pcpu-alloc: [0] 032 033 034 035 036 037 038 039 Jan 13 21:04:20.743638 kernel: pcpu-alloc: 
[0] 040 041 042 043 044 045 046 047 Jan 13 21:04:20.743645 kernel: pcpu-alloc: [0] 048 049 050 051 052 053 054 055 Jan 13 21:04:20.743650 kernel: pcpu-alloc: [0] 056 057 058 059 060 061 062 063 Jan 13 21:04:20.743656 kernel: pcpu-alloc: [0] 064 065 066 067 068 069 070 071 Jan 13 21:04:20.743663 kernel: pcpu-alloc: [0] 072 073 074 075 076 077 078 079 Jan 13 21:04:20.743668 kernel: pcpu-alloc: [0] 080 081 082 083 084 085 086 087 Jan 13 21:04:20.743674 kernel: pcpu-alloc: [0] 088 089 090 091 092 093 094 095 Jan 13 21:04:20.743680 kernel: pcpu-alloc: [0] 096 097 098 099 100 101 102 103 Jan 13 21:04:20.743685 kernel: pcpu-alloc: [0] 104 105 106 107 108 109 110 111 Jan 13 21:04:20.743691 kernel: pcpu-alloc: [0] 112 113 114 115 116 117 118 119 Jan 13 21:04:20.743697 kernel: pcpu-alloc: [0] 120 121 122 123 124 125 126 127 Jan 13 21:04:20.743703 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 21:04:20.743711 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
Jan 13 21:04:20.743716 kernel: random: crng init done Jan 13 21:04:20.743722 kernel: printk: log_buf_len individual max cpu contribution: 4096 bytes Jan 13 21:04:20.743728 kernel: printk: log_buf_len total cpu_extra contributions: 520192 bytes Jan 13 21:04:20.743734 kernel: printk: log_buf_len min size: 262144 bytes Jan 13 21:04:20.743739 kernel: printk: log_buf_len: 1048576 bytes Jan 13 21:04:20.743745 kernel: printk: early log buf free: 239648(91%) Jan 13 21:04:20.743751 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 13 21:04:20.743760 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) Jan 13 21:04:20.743771 kernel: Fallback order for Node 0: 0 Jan 13 21:04:20.743781 kernel: Built 1 zonelists, mobility grouping on. Total pages: 515808 Jan 13 21:04:20.743787 kernel: Policy zone: DMA32 Jan 13 21:04:20.743793 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 13 21:04:20.743799 kernel: Memory: 1936348K/2096628K available (12288K kernel code, 2299K rwdata, 22736K rodata, 42976K init, 2216K bss, 160020K reserved, 0K cma-reserved) Jan 13 21:04:20.743808 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=128, Nodes=1 Jan 13 21:04:20.743813 kernel: ftrace: allocating 37920 entries in 149 pages Jan 13 21:04:20.743819 kernel: ftrace: allocated 149 pages with 4 groups Jan 13 21:04:20.743825 kernel: Dynamic Preempt: voluntary Jan 13 21:04:20.743831 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 13 21:04:20.743837 kernel: rcu: RCU event tracing is enabled. Jan 13 21:04:20.743843 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=128. Jan 13 21:04:20.743849 kernel: Trampoline variant of Tasks RCU enabled. Jan 13 21:04:20.743855 kernel: Rude variant of Tasks RCU enabled. Jan 13 21:04:20.743862 kernel: Tracing variant of Tasks RCU enabled. Jan 13 21:04:20.743868 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Jan 13 21:04:20.743873 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=128 Jan 13 21:04:20.743879 kernel: NR_IRQS: 33024, nr_irqs: 1448, preallocated irqs: 16 Jan 13 21:04:20.743885 kernel: rcu: srcu_init: Setting srcu_struct sizes to big. Jan 13 21:04:20.743891 kernel: Console: colour VGA+ 80x25 Jan 13 21:04:20.743896 kernel: printk: console [tty0] enabled Jan 13 21:04:20.743902 kernel: printk: console [ttyS0] enabled Jan 13 21:04:20.743908 kernel: ACPI: Core revision 20230628 Jan 13 21:04:20.743914 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 133484882848 ns Jan 13 21:04:20.743921 kernel: APIC: Switch to symmetric I/O mode setup Jan 13 21:04:20.743928 kernel: x2apic enabled Jan 13 21:04:20.743936 kernel: APIC: Switched APIC routing to: physical x2apic Jan 13 21:04:20.743942 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Jan 13 21:04:20.743948 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 21:04:20.743954 kernel: Calibrating delay loop (skipped) preset value.. 
6816.00 BogoMIPS (lpj=3408000) Jan 13 21:04:20.743960 kernel: Disabled fast string operations Jan 13 21:04:20.743966 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Jan 13 21:04:20.743975 kernel: Last level dTLB entries: 4KB 64, 2MB 0, 4MB 0, 1GB 4 Jan 13 21:04:20.743983 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Jan 13 21:04:20.743989 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on vm exit Jan 13 21:04:20.743995 kernel: Spectre V2 : Spectre BHI mitigation: SW BHB clearing on syscall Jan 13 21:04:20.744001 kernel: Spectre V2 : Mitigation: Enhanced / Automatic IBRS Jan 13 21:04:20.744007 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Jan 13 21:04:20.744013 kernel: Spectre V2 : Spectre v2 / PBRSB-eIBRS: Retire a single CALL on VMEXIT Jan 13 21:04:20.744018 kernel: RETBleed: Mitigation: Enhanced IBRS Jan 13 21:04:20.744024 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Jan 13 21:04:20.744031 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl Jan 13 21:04:20.744037 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Jan 13 21:04:20.744043 kernel: SRBDS: Unknown: Dependent on hypervisor status Jan 13 21:04:20.744049 kernel: GDS: Unknown: Dependent on hypervisor status Jan 13 21:04:20.744055 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Jan 13 21:04:20.744061 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Jan 13 21:04:20.744066 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Jan 13 21:04:20.744072 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Jan 13 21:04:20.744078 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
Jan 13 21:04:20.744085 kernel: Freeing SMP alternatives memory: 32K Jan 13 21:04:20.744091 kernel: pid_max: default: 131072 minimum: 1024 Jan 13 21:04:20.744097 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity Jan 13 21:04:20.744102 kernel: landlock: Up and running. Jan 13 21:04:20.744108 kernel: SELinux: Initializing. Jan 13 21:04:20.744114 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 21:04:20.744120 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) Jan 13 21:04:20.744126 kernel: smpboot: CPU0: Intel(R) Xeon(R) E-2278G CPU @ 3.40GHz (family: 0x6, model: 0x9e, stepping: 0xd) Jan 13 21:04:20.744132 kernel: RCU Tasks: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 21:04:20.744139 kernel: RCU Tasks Rude: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 21:04:20.744145 kernel: RCU Tasks Trace: Setting shift to 7 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=128. Jan 13 21:04:20.744151 kernel: Performance Events: Skylake events, core PMU driver. Jan 13 21:04:20.744157 kernel: core: CPUID marked event: 'cpu cycles' unavailable Jan 13 21:04:20.744163 kernel: core: CPUID marked event: 'instructions' unavailable Jan 13 21:04:20.744168 kernel: core: CPUID marked event: 'bus cycles' unavailable Jan 13 21:04:20.744175 kernel: core: CPUID marked event: 'cache references' unavailable Jan 13 21:04:20.744180 kernel: core: CPUID marked event: 'cache misses' unavailable Jan 13 21:04:20.744210 kernel: core: CPUID marked event: 'branch instructions' unavailable Jan 13 21:04:20.744218 kernel: core: CPUID marked event: 'branch misses' unavailable Jan 13 21:04:20.744223 kernel: ... version: 1 Jan 13 21:04:20.744229 kernel: ... bit width: 48 Jan 13 21:04:20.744235 kernel: ... generic registers: 4 Jan 13 21:04:20.744241 kernel: ... value mask: 0000ffffffffffff Jan 13 21:04:20.744246 kernel: ... 
max period: 000000007fffffff Jan 13 21:04:20.744252 kernel: ... fixed-purpose events: 0 Jan 13 21:04:20.744258 kernel: ... event mask: 000000000000000f Jan 13 21:04:20.744264 kernel: signal: max sigframe size: 1776 Jan 13 21:04:20.744271 kernel: rcu: Hierarchical SRCU implementation. Jan 13 21:04:20.744277 kernel: rcu: Max phase no-delay instances is 400. Jan 13 21:04:20.744283 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Jan 13 21:04:20.744291 kernel: smp: Bringing up secondary CPUs ... Jan 13 21:04:20.744297 kernel: smpboot: x86: Booting SMP configuration: Jan 13 21:04:20.744303 kernel: .... node #0, CPUs: #1 Jan 13 21:04:20.744309 kernel: Disabled fast string operations Jan 13 21:04:20.744314 kernel: smpboot: CPU 1 Converting physical 2 to logical package 1 Jan 13 21:04:20.744320 kernel: smpboot: CPU 1 Converting physical 0 to logical die 1 Jan 13 21:04:20.744331 kernel: smp: Brought up 1 node, 2 CPUs Jan 13 21:04:20.744337 kernel: smpboot: Max logical packages: 128 Jan 13 21:04:20.744343 kernel: smpboot: Total of 2 processors activated (13632.00 BogoMIPS) Jan 13 21:04:20.744348 kernel: devtmpfs: initialized Jan 13 21:04:20.744354 kernel: x86/mm: Memory block size: 128MB Jan 13 21:04:20.744360 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x7feff000-0x7fefffff] (4096 bytes) Jan 13 21:04:20.744366 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 13 21:04:20.744373 kernel: futex hash table entries: 32768 (order: 9, 2097152 bytes, linear) Jan 13 21:04:20.744379 kernel: pinctrl core: initialized pinctrl subsystem Jan 13 21:04:20.744386 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 13 21:04:20.744392 kernel: audit: initializing netlink subsys (disabled) Jan 13 21:04:20.744398 kernel: audit: type=2000 audit(1736802259.067:1): state=initialized audit_enabled=0 res=1 Jan 13 21:04:20.744404 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 13 21:04:20.744409 
kernel: thermal_sys: Registered thermal governor 'user_space' Jan 13 21:04:20.744415 kernel: cpuidle: using governor menu Jan 13 21:04:20.744421 kernel: Simple Boot Flag at 0x36 set to 0x80 Jan 13 21:04:20.744427 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 13 21:04:20.744433 kernel: dca service started, version 1.12.1 Jan 13 21:04:20.744439 kernel: PCI: MMCONFIG for domain 0000 [bus 00-7f] at [mem 0xf0000000-0xf7ffffff] (base 0xf0000000) Jan 13 21:04:20.744445 kernel: PCI: Using configuration type 1 for base access Jan 13 21:04:20.744451 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Jan 13 21:04:20.744457 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 13 21:04:20.744463 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page Jan 13 21:04:20.744469 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 13 21:04:20.744475 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page Jan 13 21:04:20.744480 kernel: ACPI: Added _OSI(Module Device) Jan 13 21:04:20.744486 kernel: ACPI: Added _OSI(Processor Device) Jan 13 21:04:20.744495 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Jan 13 21:04:20.744505 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 13 21:04:20.744514 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 13 21:04:20.744520 kernel: ACPI: [Firmware Bug]: BIOS _OSI(Linux) query ignored Jan 13 21:04:20.744526 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC Jan 13 21:04:20.744532 kernel: ACPI: Interpreter enabled Jan 13 21:04:20.744538 kernel: ACPI: PM: (supports S0 S1 S5) Jan 13 21:04:20.744544 kernel: ACPI: Using IOAPIC for interrupt routing Jan 13 21:04:20.744549 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Jan 13 21:04:20.744557 kernel: PCI: Using E820 reservations for host bridge windows Jan 13 21:04:20.744563 kernel: ACPI: Enabled 4 
GPEs in block 00 to 0F Jan 13 21:04:20.744569 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-7f]) Jan 13 21:04:20.744648 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 13 21:04:20.744705 kernel: acpi PNP0A03:00: _OSC: platform does not support [AER LTR] Jan 13 21:04:20.744755 kernel: acpi PNP0A03:00: _OSC: OS now controls [PCIeHotplug PME PCIeCapability] Jan 13 21:04:20.744764 kernel: PCI host bridge to bus 0000:00 Jan 13 21:04:20.744814 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.744872 kernel: pci_bus 0000:00: root bus resource [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.744919 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.744964 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Jan 13 21:04:20.745008 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xfeff window] Jan 13 21:04:20.745056 kernel: pci_bus 0000:00: root bus resource [bus 00-7f] Jan 13 21:04:20.745122 kernel: pci 0000:00:00.0: [8086:7190] type 00 class 0x060000 Jan 13 21:04:20.745187 kernel: pci 0000:00:01.0: [8086:7191] type 01 class 0x060400 Jan 13 21:04:20.745252 kernel: pci 0000:00:07.0: [8086:7110] type 00 class 0x060100 Jan 13 21:04:20.745308 kernel: pci 0000:00:07.1: [8086:7111] type 00 class 0x01018a Jan 13 21:04:20.745361 kernel: pci 0000:00:07.1: reg 0x20: [io 0x1060-0x106f] Jan 13 21:04:20.745426 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Jan 13 21:04:20.745478 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Jan 13 21:04:20.745531 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Jan 13 21:04:20.745580 kernel: pci 0000:00:07.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Jan 13 21:04:20.745641 kernel: pci 0000:00:07.3: [8086:7113] type 00 class 0x068000 Jan 13 21:04:20.745692 kernel: pci 0000:00:07.3: quirk: [io 0x1000-0x103f] claimed 
by PIIX4 ACPI
Jan 13 21:04:20.745742 kernel: pci 0000:00:07.3: quirk: [io 0x1040-0x104f] claimed by PIIX4 SMB
Jan 13 21:04:20.745796 kernel: pci 0000:00:07.7: [15ad:0740] type 00 class 0x088000
Jan 13 21:04:20.745862 kernel: pci 0000:00:07.7: reg 0x10: [io 0x1080-0x10bf]
Jan 13 21:04:20.745938 kernel: pci 0000:00:07.7: reg 0x14: [mem 0xfebfe000-0xfebfffff 64bit]
Jan 13 21:04:20.746002 kernel: pci 0000:00:0f.0: [15ad:0405] type 00 class 0x030000
Jan 13 21:04:20.746055 kernel: pci 0000:00:0f.0: reg 0x10: [io 0x1070-0x107f]
Jan 13 21:04:20.746108 kernel: pci 0000:00:0f.0: reg 0x14: [mem 0xe8000000-0xefffffff pref]
Jan 13 21:04:20.746178 kernel: pci 0000:00:0f.0: reg 0x18: [mem 0xfe000000-0xfe7fffff]
Jan 13 21:04:20.746263 kernel: pci 0000:00:0f.0: reg 0x30: [mem 0x00000000-0x00007fff pref]
Jan 13 21:04:20.746329 kernel: pci 0000:00:0f.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 13 21:04:20.746388 kernel: pci 0000:00:11.0: [15ad:0790] type 01 class 0x060401
Jan 13 21:04:20.746444 kernel: pci 0000:00:15.0: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.746495 kernel: pci 0000:00:15.0: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.746549 kernel: pci 0000:00:15.1: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.746600 kernel: pci 0000:00:15.1: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.746658 kernel: pci 0000:00:15.2: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.746711 kernel: pci 0000:00:15.2: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.746770 kernel: pci 0000:00:15.3: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.746822 kernel: pci 0000:00:15.3: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.746949 kernel: pci 0000:00:15.4: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.747421 kernel: pci 0000:00:15.4: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.747489 kernel: pci 0000:00:15.5: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.747548 kernel: pci 0000:00:15.5: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.747606 kernel: pci 0000:00:15.6: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.747659 kernel: pci 0000:00:15.6: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.747717 kernel: pci 0000:00:15.7: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.747771 kernel: pci 0000:00:15.7: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.747843 kernel: pci 0000:00:16.0: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.747904 kernel: pci 0000:00:16.0: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.747971 kernel: pci 0000:00:16.1: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748028 kernel: pci 0000:00:16.1: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748085 kernel: pci 0000:00:16.2: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748138 kernel: pci 0000:00:16.2: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748208 kernel: pci 0000:00:16.3: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748263 kernel: pci 0000:00:16.3: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748320 kernel: pci 0000:00:16.4: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748373 kernel: pci 0000:00:16.4: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748440 kernel: pci 0000:00:16.5: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748503 kernel: pci 0000:00:16.5: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748632 kernel: pci 0000:00:16.6: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748695 kernel: pci 0000:00:16.6: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748751 kernel: pci 0000:00:16.7: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748803 kernel: pci 0000:00:16.7: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748858 kernel: pci 0000:00:17.0: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.748909 kernel: pci 0000:00:17.0: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.748979 kernel: pci 0000:00:17.1: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749036 kernel: pci 0000:00:17.1: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749102 kernel: pci 0000:00:17.2: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749158 kernel: pci 0000:00:17.2: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749268 kernel: pci 0000:00:17.3: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749321 kernel: pci 0000:00:17.3: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749376 kernel: pci 0000:00:17.4: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749456 kernel: pci 0000:00:17.4: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749547 kernel: pci 0000:00:17.5: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749616 kernel: pci 0000:00:17.5: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749676 kernel: pci 0000:00:17.6: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749729 kernel: pci 0000:00:17.6: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749784 kernel: pci 0000:00:17.7: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749839 kernel: pci 0000:00:17.7: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.749893 kernel: pci 0000:00:18.0: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.749944 kernel: pci 0000:00:18.0: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750003 kernel: pci 0000:00:18.1: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750062 kernel: pci 0000:00:18.1: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750140 kernel: pci 0000:00:18.2: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750220 kernel: pci 0000:00:18.2: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750301 kernel: pci 0000:00:18.3: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750355 kernel: pci 0000:00:18.3: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750410 kernel: pci 0000:00:18.4: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750461 kernel: pci 0000:00:18.4: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750528 kernel: pci 0000:00:18.5: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750581 kernel: pci 0000:00:18.5: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750648 kernel: pci 0000:00:18.6: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750712 kernel: pci 0000:00:18.6: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750768 kernel: pci 0000:00:18.7: [15ad:07a0] type 01 class 0x060400
Jan 13 21:04:20.750818 kernel: pci 0000:00:18.7: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.750871 kernel: pci_bus 0000:01: extended config space not accessible
Jan 13 21:04:20.750923 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 21:04:20.750977 kernel: pci_bus 0000:02: extended config space not accessible
Jan 13 21:04:20.750986 kernel: acpiphp: Slot [32] registered
Jan 13 21:04:20.750993 kernel: acpiphp: Slot [33] registered
Jan 13 21:04:20.750999 kernel: acpiphp: Slot [34] registered
Jan 13 21:04:20.751005 kernel: acpiphp: Slot [35] registered
Jan 13 21:04:20.751014 kernel: acpiphp: Slot [36] registered
Jan 13 21:04:20.751020 kernel: acpiphp: Slot [37] registered
Jan 13 21:04:20.751026 kernel: acpiphp: Slot [38] registered
Jan 13 21:04:20.751032 kernel: acpiphp: Slot [39] registered
Jan 13 21:04:20.751040 kernel: acpiphp: Slot [40] registered
Jan 13 21:04:20.751048 kernel: acpiphp: Slot [41] registered
Jan 13 21:04:20.751056 kernel: acpiphp: Slot [42] registered
Jan 13 21:04:20.751062 kernel: acpiphp: Slot [43] registered
Jan 13 21:04:20.751068 kernel: acpiphp: Slot [44] registered
Jan 13 21:04:20.751073 kernel: acpiphp: Slot [45] registered
Jan 13 21:04:20.751079 kernel: acpiphp: Slot [46] registered
Jan 13 21:04:20.751085 kernel: acpiphp: Slot [47] registered
Jan 13 21:04:20.751091 kernel: acpiphp: Slot [48] registered
Jan 13 21:04:20.751098 kernel: acpiphp: Slot [49] registered
Jan 13 21:04:20.751104 kernel: acpiphp: Slot [50] registered
Jan 13 21:04:20.751110 kernel: acpiphp: Slot [51] registered
Jan 13 21:04:20.751115 kernel: acpiphp: Slot [52] registered
Jan 13 21:04:20.751121 kernel: acpiphp: Slot [53] registered
Jan 13 21:04:20.751127 kernel: acpiphp: Slot [54] registered
Jan 13 21:04:20.751133 kernel: acpiphp: Slot [55] registered
Jan 13 21:04:20.751139 kernel: acpiphp: Slot [56] registered
Jan 13 21:04:20.751144 kernel: acpiphp: Slot [57] registered
Jan 13 21:04:20.751150 kernel: acpiphp: Slot [58] registered
Jan 13 21:04:20.751157 kernel: acpiphp: Slot [59] registered
Jan 13 21:04:20.751163 kernel: acpiphp: Slot [60] registered
Jan 13 21:04:20.751169 kernel: acpiphp: Slot [61] registered
Jan 13 21:04:20.751176 kernel: acpiphp: Slot [62] registered
Jan 13 21:04:20.751190 kernel: acpiphp: Slot [63] registered
Jan 13 21:04:20.751257 kernel: pci 0000:00:11.0: PCI bridge to [bus 02] (subtractive decode)
Jan 13 21:04:20.751311 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 13 21:04:20.751360 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 13 21:04:20.751411 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 21:04:20.751463 kernel: pci 0000:00:11.0: bridge window [mem 0x000a0000-0x000bffff window] (subtractive decode)
Jan 13 21:04:20.751514 kernel: pci 0000:00:11.0: bridge window [mem 0x000cc000-0x000dbfff window] (subtractive decode)
Jan 13 21:04:20.751564 kernel: pci 0000:00:11.0: bridge window [mem 0xc0000000-0xfebfffff window] (subtractive decode)
Jan 13 21:04:20.751618 kernel: pci 0000:00:11.0: bridge window [io 0x0000-0x0cf7 window] (subtractive decode)
Jan 13 21:04:20.751668 kernel: pci 0000:00:11.0: bridge window [io 0x0d00-0xfeff window] (subtractive decode)
Jan 13 21:04:20.751745 kernel: pci 0000:03:00.0: [15ad:07c0] type 00 class 0x010700
Jan 13 21:04:20.751816 kernel: pci 0000:03:00.0: reg 0x10: [io 0x4000-0x4007]
Jan 13 21:04:20.751873 kernel: pci 0000:03:00.0: reg 0x14: [mem 0xfd5f8000-0xfd5fffff 64bit]
Jan 13 21:04:20.751926 kernel: pci 0000:03:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Jan 13 21:04:20.751977 kernel: pci 0000:03:00.0: PME# supported from D0 D3hot D3cold
Jan 13 21:04:20.752033 kernel: pci 0000:03:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Jan 13 21:04:20.752085 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 13 21:04:20.752138 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 13 21:04:20.752251 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 13 21:04:20.752321 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 13 21:04:20.752384 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 13 21:04:20.752435 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 13 21:04:20.752485 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13 21:04:20.752563 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jan 13 21:04:20.752652 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jan 13 21:04:20.752733 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Jan 13 21:04:20.752784 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 13 21:04:20.752842 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Jan 13 21:04:20.752909 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Jan 13 21:04:20.752961 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 13 21:04:20.753012 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Jan 13 21:04:20.753062 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Jan 13 21:04:20.753112 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 13 21:04:20.753166 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Jan 13 21:04:20.753573 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Jan 13 21:04:20.753628 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 13 21:04:20.753685 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Jan 13 21:04:20.753745 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Jan 13 21:04:20.753796 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 13 21:04:20.753853 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Jan 13 21:04:20.753913 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Jan 13 21:04:20.753964 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 13 21:04:20.754021 kernel: pci 0000:0b:00.0: [15ad:07b0] type 00 class 0x020000
Jan 13 21:04:20.754074 kernel: pci 0000:0b:00.0: reg 0x10: [mem 0xfd4fc000-0xfd4fcfff]
Jan 13 21:04:20.754126 kernel: pci 0000:0b:00.0: reg 0x14: [mem 0xfd4fd000-0xfd4fdfff]
Jan 13 21:04:20.754177 kernel: pci 0000:0b:00.0: reg 0x18: [mem 0xfd4fe000-0xfd4fffff]
Jan 13 21:04:20.754255 kernel: pci 0000:0b:00.0: reg 0x1c: [io 0x5000-0x500f]
Jan 13 21:04:20.754308 kernel: pci 0000:0b:00.0: reg 0x30: [mem 0x00000000-0x0000ffff pref]
Jan 13 21:04:20.754361 kernel: pci 0000:0b:00.0: supports D1 D2
Jan 13 21:04:20.754416 kernel: pci 0000:0b:00.0: PME# supported from D0 D1 D2 D3hot D3cold
Jan 13 21:04:20.754470 kernel: pci 0000:0b:00.0: disabling ASPM on pre-1.1 PCIe device. You can enable it with 'pcie_aspm=force'
Jan 13 21:04:20.754521 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Jan 13 21:04:20.754571 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Jan 13 21:04:20.754621 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Jan 13 21:04:20.754675 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Jan 13 21:04:20.754725 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Jan 13 21:04:20.754779 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Jan 13 21:04:20.754836 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 13 21:04:20.754888 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Jan 13 21:04:20.754952 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Jan 13 21:04:20.755004 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Jan 13 21:04:20.755054 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 13 21:04:20.755108 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Jan 13 21:04:20.755158 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Jan 13 21:04:20.755238 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 13 21:04:20.755296 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Jan 13 21:04:20.755355 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Jan 13 21:04:20.755409 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 13 21:04:20.755477 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Jan 13 21:04:20.755527 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Jan 13 21:04:20.757225 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 13 21:04:20.757296 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Jan 13 21:04:20.757354 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Jan 13 21:04:20.757410 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 13 21:04:20.757470 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Jan 13 21:04:20.757523 kernel: pci 0000:00:16.7: bridge window [mem 0xfb800000-0xfb8fffff]
Jan 13 21:04:20.757573 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref]
Jan 13 21:04:20.757629 kernel: pci 0000:00:17.0: PCI bridge to [bus 13]
Jan 13 21:04:20.757690 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff]
Jan 13 21:04:20.757741 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff]
Jan 13 21:04:20.757791 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref]
Jan 13 21:04:20.757868 kernel: pci 0000:00:17.1: PCI bridge to [bus 14]
Jan 13 21:04:20.757922 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff]
Jan 13 21:04:20.757976 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff]
Jan 13 21:04:20.758039 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref]
Jan 13 21:04:20.758096 kernel: pci 0000:00:17.2: PCI bridge to [bus 15]
Jan 13 21:04:20.758147 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff]
Jan 13 21:04:20.758233 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff]
Jan 13 21:04:20.758287 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref]
Jan 13 21:04:20.758339 kernel: pci 0000:00:17.3: PCI bridge to [bus 16]
Jan 13 21:04:20.758389 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff]
Jan 13 21:04:20.758440 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref]
Jan 13 21:04:20.758492 kernel: pci 0000:00:17.4: PCI bridge to [bus 17]
Jan 13 21:04:20.758550 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff]
Jan 13 21:04:20.758602 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref]
Jan 13 21:04:20.758655 kernel: pci 0000:00:17.5: PCI bridge to [bus 18]
Jan 13 21:04:20.758724 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff]
Jan 13 21:04:20.758778 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref]
Jan 13 21:04:20.758831 kernel: pci 0000:00:17.6: PCI bridge to [bus 19]
Jan 13 21:04:20.758882 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff]
Jan 13 21:04:20.758932 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref]
Jan 13 21:04:20.758987 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a]
Jan 13 21:04:20.759039 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff]
Jan 13 21:04:20.759106 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref]
Jan 13 21:04:20.759161 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b]
Jan 13 21:04:20.761240 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff]
Jan 13 21:04:20.761300 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff]
Jan 13 21:04:20.761354 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref]
Jan 13 21:04:20.761409 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c]
Jan 13 21:04:20.761464 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff]
Jan 13 21:04:20.761515 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff]
Jan 13 21:04:20.761568 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref]
Jan 13 21:04:20.761620 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d]
Jan 13 21:04:20.761671 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff]
Jan 13 21:04:20.761721 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref]
Jan 13 21:04:20.761773 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e]
Jan 13 21:04:20.761826 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff]
Jan 13 21:04:20.761875 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref]
Jan 13 21:04:20.761927 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f]
Jan 13 21:04:20.761977 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff]
Jan 13 21:04:20.762027 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 64bit pref]
Jan 13 21:04:20.762079 kernel: pci 0000:00:18.5: PCI bridge to [bus 20]
Jan 13 21:04:20.762129 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff]
Jan 13 21:04:20.762178 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref]
Jan 13 21:04:20.762241 kernel: pci 0000:00:18.6: PCI bridge to [bus 21]
Jan 13 21:04:20.762292 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff]
Jan 13 21:04:20.762342 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref]
Jan 13 21:04:20.762393 kernel: pci 0000:00:18.7: PCI bridge to [bus 22]
Jan 13 21:04:20.762443 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff]
Jan 13 21:04:20.762493 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref]
Jan 13 21:04:20.762502 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 9
Jan 13 21:04:20.762508 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 0
Jan 13 21:04:20.762514 kernel: ACPI: PCI: Interrupt link LNKB disabled
Jan 13 21:04:20.762522 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 13 21:04:20.762528 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 10
Jan 13 21:04:20.762534 kernel: iommu: Default domain type: Translated
Jan 13 21:04:20.762540 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 13 21:04:20.762546 kernel: PCI: Using ACPI for IRQ routing
Jan 13 21:04:20.762552 kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 13 21:04:20.762559 kernel: e820: reserve RAM buffer [mem 0x0009ec00-0x0009ffff]
Jan 13 21:04:20.762565 kernel: e820: reserve RAM buffer [mem 0x7fee0000-0x7fffffff]
Jan 13 21:04:20.762614 kernel: pci 0000:00:0f.0: vgaarb: setting as boot VGA device
Jan 13 21:04:20.762667 kernel: pci 0000:00:0f.0: vgaarb: bridge control possible
Jan 13 21:04:20.762718 kernel: pci 0000:00:0f.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 13 21:04:20.762727 kernel: vgaarb: loaded
Jan 13 21:04:20.762734 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0
Jan 13 21:04:20.762740 kernel: hpet0: 16 comparators, 64-bit 14.318180 MHz counter
Jan 13 21:04:20.762746 kernel: clocksource: Switched to clocksource tsc-early
Jan 13 21:04:20.762752 kernel: VFS: Disk quotas dquot_6.6.0
Jan 13 21:04:20.762758 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 13 21:04:20.762764 kernel: pnp: PnP ACPI init
Jan 13 21:04:20.762822 kernel: system 00:00: [io 0x1000-0x103f] has been reserved
Jan 13 21:04:20.762870 kernel: system 00:00: [io 0x1040-0x104f] has been reserved
Jan 13 21:04:20.762916 kernel: system 00:00: [io 0x0cf0-0x0cf1] has been reserved
Jan 13 21:04:20.762965 kernel: system 00:04: [mem 0xfed00000-0xfed003ff] has been reserved
Jan 13 21:04:20.763015 kernel: pnp 00:06: [dma 2]
Jan 13 21:04:20.763064 kernel: system 00:07: [io 0xfce0-0xfcff] has been reserved
Jan 13 21:04:20.763113 kernel: system 00:07: [mem 0xf0000000-0xf7ffffff] has been reserved
Jan 13 21:04:20.763158 kernel: system 00:07: [mem 0xfe800000-0xfe9fffff] has been reserved
Jan 13 21:04:20.763167 kernel: pnp: PnP ACPI: found 8 devices
Jan 13 21:04:20.763174 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 13 21:04:20.763179 kernel: NET: Registered PF_INET protocol family
Jan 13 21:04:20.764595 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jan 13 21:04:20.764602 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jan 13 21:04:20.764611 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 13 21:04:20.764620 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 13 21:04:20.764626 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jan 13 21:04:20.764632 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jan 13 21:04:20.764638 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 21:04:20.764644 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jan 13 21:04:20.764650 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 13 21:04:20.764656 kernel: NET: Registered PF_XDP protocol family
Jan 13 21:04:20.764722 kernel: pci 0000:00:15.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 03] add_size 200000 add_align 100000
Jan 13 21:04:20.764782 kernel: pci 0000:00:15.3: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jan 13 21:04:20.764837 kernel: pci 0000:00:15.4: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jan 13 21:04:20.764890 kernel: pci 0000:00:15.5: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jan 13 21:04:20.764943 kernel: pci 0000:00:15.6: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jan 13 21:04:20.764995 kernel: pci 0000:00:15.7: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Jan 13 21:04:20.765048 kernel: pci 0000:00:16.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Jan 13 21:04:20.765103 kernel: pci 0000:00:16.3: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000
Jan 13 21:04:20.765155 kernel: pci 0000:00:16.4: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000
Jan 13 21:04:20.765221 kernel: pci 0000:00:16.5: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000
Jan 13 21:04:20.765274 kernel: pci 0000:00:16.6: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000
Jan 13 21:04:20.765326 kernel: pci 0000:00:16.7: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000
Jan 13 21:04:20.765377 kernel: pci 0000:00:17.3: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000
Jan 13 21:04:20.765431 kernel: pci 0000:00:17.4: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000
Jan 13 21:04:20.765484 kernel: pci 0000:00:17.5: bridge window [io 0x1000-0x0fff] to [bus 18] add_size 1000
Jan 13 21:04:20.765535 kernel: pci 0000:00:17.6: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000
Jan 13 21:04:20.765586 kernel: pci 0000:00:17.7: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000
Jan 13 21:04:20.765637 kernel: pci 0000:00:18.2: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000
Jan 13 21:04:20.765687 kernel: pci 0000:00:18.3: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000
Jan 13 21:04:20.765740 kernel: pci 0000:00:18.4: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000
Jan 13 21:04:20.765791 kernel: pci 0000:00:18.5: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000
Jan 13 21:04:20.765840 kernel: pci 0000:00:18.6: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000
Jan 13 21:04:20.765890 kernel: pci 0000:00:18.7: bridge window [io 0x1000-0x0fff] to [bus 22] add_size 1000
Jan 13 21:04:20.765941 kernel: pci 0000:00:15.0: BAR 15: assigned [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 21:04:20.765992 kernel: pci 0000:00:16.0: BAR 15: assigned [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 21:04:20.766045 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.766094 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.766145 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768215 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768271 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768322 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768372 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768422 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768476 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768527 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768577 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768627 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768677 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768727 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768777 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768827 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768880 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.768930 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.768979 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.769029 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.769080 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.769130 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.769179 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.769237 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.769290 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770280 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770334 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770384 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770434 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770484 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770534 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770585 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770638 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770687 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770737 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770786 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770836 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770886 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.770935 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.770984 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.771054 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772241 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772305 kernel: pci 0000:00:18.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772356 kernel: pci 0000:00:18.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772406 kernel: pci 0000:00:18.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772455 kernel: pci 0000:00:18.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772506 kernel: pci 0000:00:18.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772555 kernel: pci 0000:00:18.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772604 kernel: pci 0000:00:18.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772657 kernel: pci 0000:00:18.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772706 kernel: pci 0000:00:18.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772756 kernel: pci 0000:00:18.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772806 kernel: pci 0000:00:18.2: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772855 kernel: pci 0000:00:18.2: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.772906 kernel: pci 0000:00:17.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.772956 kernel: pci 0000:00:17.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773005 kernel: pci 0000:00:17.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773054 kernel: pci 0000:00:17.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773104 kernel: pci 0000:00:17.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773157 kernel: pci 0000:00:17.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773214 kernel: pci 0000:00:17.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773269 kernel: pci 0000:00:17.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773319 kernel: pci 0000:00:17.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773369 kernel: pci 0000:00:17.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773419 kernel: pci 0000:00:16.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773468 kernel: pci 0000:00:16.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773518 kernel: pci 0000:00:16.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773568 kernel: pci 0000:00:16.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773622 kernel: pci 0000:00:16.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773671 kernel: pci 0000:00:16.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773721 kernel: pci 0000:00:16.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773770 kernel: pci 0000:00:16.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773820 kernel: pci 0000:00:16.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773870 kernel: pci 0000:00:16.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.773920 kernel: pci 0000:00:15.7: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.773969 kernel: pci 0000:00:15.7: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.774019 kernel: pci 0000:00:15.6: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.774068 kernel: pci 0000:00:15.6: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.774120 kernel: pci 0000:00:15.5: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.774169 kernel: pci 0000:00:15.5: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.776878 kernel: pci 0000:00:15.4: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.776931 kernel: pci 0000:00:15.4: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.776982 kernel: pci 0000:00:15.3: BAR 13: no space for [io size 0x1000]
Jan 13 21:04:20.777032 kernel: pci 0000:00:15.3: BAR 13: failed to assign [io size 0x1000]
Jan 13 21:04:20.777083 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Jan 13 21:04:20.777134 kernel: pci 0000:00:11.0: PCI bridge to [bus 02]
Jan 13 21:04:20.777235 kernel: pci 0000:00:11.0: bridge window [io 0x2000-0x3fff]
Jan 13 21:04:20.777296 kernel: pci 0000:00:11.0: bridge window [mem 0xfd600000-0xfdffffff]
Jan 13 21:04:20.777346 kernel: pci 0000:00:11.0: bridge window [mem 0xe7b00000-0xe7ffffff 64bit pref]
Jan 13 21:04:20.777401 kernel: pci 0000:03:00.0: BAR 6: assigned [mem 0xfd500000-0xfd50ffff pref]
Jan 13 21:04:20.777451 kernel: pci 0000:00:15.0: PCI bridge to [bus 03]
Jan 13 21:04:20.777501 kernel: pci 0000:00:15.0: bridge window [io 0x4000-0x4fff]
Jan 13 21:04:20.777551 kernel: pci 0000:00:15.0: bridge window [mem 0xfd500000-0xfd5fffff]
Jan 13 21:04:20.777601 kernel: pci 0000:00:15.0: bridge window [mem 0xc0000000-0xc01fffff 64bit pref]
Jan 13 21:04:20.777652 kernel: pci 0000:00:15.1: PCI bridge to [bus 04]
Jan 13 21:04:20.777704 kernel: pci 0000:00:15.1: bridge window [io 0x8000-0x8fff]
Jan 13 21:04:20.777754 kernel: pci 0000:00:15.1: bridge window [mem 0xfd100000-0xfd1fffff]
Jan 13 21:04:20.777804 kernel: pci 0000:00:15.1: bridge window [mem 0xe7800000-0xe78fffff 64bit pref]
Jan 13 21:04:20.777870 kernel: pci 0000:00:15.2: PCI bridge to [bus 05]
Jan 13 21:04:20.777921 kernel: pci 0000:00:15.2: bridge window [io 0xc000-0xcfff]
Jan 13 21:04:20.777971 kernel: pci 0000:00:15.2: bridge window [mem 0xfcd00000-0xfcdfffff]
Jan 13 21:04:20.778020 kernel: pci 0000:00:15.2: bridge window [mem 0xe7400000-0xe74fffff 64bit pref]
Jan 13 21:04:20.778070 kernel: pci 0000:00:15.3: PCI bridge to [bus 06]
Jan 13 21:04:20.778119 kernel: pci 0000:00:15.3: bridge window [mem 0xfc900000-0xfc9fffff]
Jan 13 21:04:20.778170 kernel: pci 0000:00:15.3: bridge window [mem 0xe7000000-0xe70fffff 64bit pref]
Jan 13 21:04:20.778231 kernel: pci 0000:00:15.4: PCI bridge to [bus 07]
Jan 13 21:04:20.778281 kernel: pci 0000:00:15.4: bridge window [mem 0xfc500000-0xfc5fffff]
Jan 13 21:04:20.778331 kernel: pci 0000:00:15.4: bridge window [mem 0xe6c00000-0xe6cfffff 64bit pref]
Jan 13 21:04:20.778383 kernel: pci 0000:00:15.5: PCI bridge to [bus 08]
Jan 13 21:04:20.778433 kernel: pci 0000:00:15.5: bridge window [mem 0xfc100000-0xfc1fffff]
Jan 13 21:04:20.778482 kernel: pci 0000:00:15.5: bridge window [mem 0xe6800000-0xe68fffff 64bit pref]
Jan 13 21:04:20.778535 kernel: pci 0000:00:15.6: PCI bridge to [bus 09]
Jan 13 21:04:20.778585 kernel: pci 0000:00:15.6: bridge window [mem 0xfbd00000-0xfbdfffff]
Jan 13 21:04:20.778634 kernel: pci 0000:00:15.6: bridge window [mem 0xe6400000-0xe64fffff 64bit pref]
Jan 13 21:04:20.778684 kernel: pci 0000:00:15.7: PCI bridge to [bus 0a]
Jan 13 21:04:20.778734 kernel: pci 0000:00:15.7: bridge window [mem 0xfb900000-0xfb9fffff]
Jan 13 21:04:20.778784 kernel: pci 0000:00:15.7: bridge window [mem 0xe6000000-0xe60fffff 64bit pref]
Jan 13 21:04:20.778837 kernel: pci 0000:0b:00.0: BAR 6: assigned [mem 0xfd400000-0xfd40ffff pref]
Jan 13 21:04:20.778888 kernel: pci 0000:00:16.0: PCI bridge to [bus 0b]
Jan 13 21:04:20.778937 kernel: pci 0000:00:16.0: bridge window [io 0x5000-0x5fff]
Jan 13 21:04:20.778990 kernel: pci 0000:00:16.0: bridge window [mem 0xfd400000-0xfd4fffff]
Jan 13 21:04:20.779040 kernel: pci 0000:00:16.0: bridge window [mem 0xc0200000-0xc03fffff 64bit pref]
Jan 13 21:04:20.779091 kernel: pci 0000:00:16.1: PCI bridge to [bus 0c]
Jan 13 21:04:20.779142 kernel: pci 0000:00:16.1: bridge window [io 0x9000-0x9fff]
Jan 13 21:04:20.779202 kernel: pci 0000:00:16.1: bridge window [mem 0xfd000000-0xfd0fffff]
Jan 13 21:04:20.779269 kernel: pci 0000:00:16.1: bridge window [mem 0xe7700000-0xe77fffff 64bit pref]
Jan 13 21:04:20.779321 kernel: pci 0000:00:16.2: PCI bridge to [bus 0d]
Jan 13 21:04:20.779371 kernel: pci 0000:00:16.2: bridge window [io 0xd000-0xdfff]
Jan 13 21:04:20.779421 kernel: pci 0000:00:16.2: bridge window [mem 0xfcc00000-0xfccfffff]
Jan 13 21:04:20.779475 kernel: pci 0000:00:16.2: bridge window [mem 0xe7300000-0xe73fffff 64bit pref]
Jan 13 21:04:20.779525 kernel: pci 0000:00:16.3: PCI bridge to [bus 0e]
Jan 13 21:04:20.779575 kernel: pci 0000:00:16.3: bridge window [mem 0xfc800000-0xfc8fffff]
Jan 13 21:04:20.779626 kernel: pci 0000:00:16.3: bridge window [mem 0xe6f00000-0xe6ffffff 64bit pref]
Jan 13 21:04:20.779677 kernel: pci 0000:00:16.4: PCI bridge to [bus 0f]
Jan 13 21:04:20.779728 kernel: pci 0000:00:16.4: bridge window [mem 0xfc400000-0xfc4fffff]
Jan 13 21:04:20.779778 kernel: pci 0000:00:16.4: bridge window [mem 0xe6b00000-0xe6bfffff 64bit pref]
Jan 13 21:04:20.779829 kernel: pci 0000:00:16.5: PCI bridge to [bus 10]
Jan 13 21:04:20.779879 kernel: pci 0000:00:16.5: bridge window [mem 0xfc000000-0xfc0fffff]
Jan 13 21:04:20.779930 kernel: pci 0000:00:16.5: bridge window [mem 0xe6700000-0xe67fffff 64bit pref]
Jan 13 21:04:20.779982 kernel: pci 0000:00:16.6: PCI bridge to [bus 11]
Jan 13 21:04:20.780032 kernel: pci 0000:00:16.6: bridge window [mem 0xfbc00000-0xfbcfffff]
Jan 13 21:04:20.780082 kernel: pci 0000:00:16.6: bridge window [mem 0xe6300000-0xe63fffff 64bit pref]
Jan 13 21:04:20.780132 kernel: pci 0000:00:16.7: PCI bridge to [bus 12]
Jan 13 21:04:20.780196 kernel: pci 0000:00:16.7:
bridge window [mem 0xfb800000-0xfb8fffff] Jan 13 21:04:20.780257 kernel: pci 0000:00:16.7: bridge window [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 21:04:20.780308 kernel: pci 0000:00:17.0: PCI bridge to [bus 13] Jan 13 21:04:20.780358 kernel: pci 0000:00:17.0: bridge window [io 0x6000-0x6fff] Jan 13 21:04:20.780408 kernel: pci 0000:00:17.0: bridge window [mem 0xfd300000-0xfd3fffff] Jan 13 21:04:20.780460 kernel: pci 0000:00:17.0: bridge window [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 21:04:20.780511 kernel: pci 0000:00:17.1: PCI bridge to [bus 14] Jan 13 21:04:20.780562 kernel: pci 0000:00:17.1: bridge window [io 0xa000-0xafff] Jan 13 21:04:20.780612 kernel: pci 0000:00:17.1: bridge window [mem 0xfcf00000-0xfcffffff] Jan 13 21:04:20.780662 kernel: pci 0000:00:17.1: bridge window [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 21:04:20.780713 kernel: pci 0000:00:17.2: PCI bridge to [bus 15] Jan 13 21:04:20.780763 kernel: pci 0000:00:17.2: bridge window [io 0xe000-0xefff] Jan 13 21:04:20.780813 kernel: pci 0000:00:17.2: bridge window [mem 0xfcb00000-0xfcbfffff] Jan 13 21:04:20.780865 kernel: pci 0000:00:17.2: bridge window [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 21:04:20.780916 kernel: pci 0000:00:17.3: PCI bridge to [bus 16] Jan 13 21:04:20.780970 kernel: pci 0000:00:17.3: bridge window [mem 0xfc700000-0xfc7fffff] Jan 13 21:04:20.781022 kernel: pci 0000:00:17.3: bridge window [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 21:04:20.781072 kernel: pci 0000:00:17.4: PCI bridge to [bus 17] Jan 13 21:04:20.782277 kernel: pci 0000:00:17.4: bridge window [mem 0xfc300000-0xfc3fffff] Jan 13 21:04:20.782335 kernel: pci 0000:00:17.4: bridge window [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 21:04:20.782388 kernel: pci 0000:00:17.5: PCI bridge to [bus 18] Jan 13 21:04:20.782439 kernel: pci 0000:00:17.5: bridge window [mem 0xfbf00000-0xfbffffff] Jan 13 21:04:20.782491 kernel: pci 0000:00:17.5: bridge window [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 
21:04:20.782541 kernel: pci 0000:00:17.6: PCI bridge to [bus 19] Jan 13 21:04:20.782595 kernel: pci 0000:00:17.6: bridge window [mem 0xfbb00000-0xfbbfffff] Jan 13 21:04:20.782646 kernel: pci 0000:00:17.6: bridge window [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 21:04:20.782695 kernel: pci 0000:00:17.7: PCI bridge to [bus 1a] Jan 13 21:04:20.782746 kernel: pci 0000:00:17.7: bridge window [mem 0xfb700000-0xfb7fffff] Jan 13 21:04:20.782795 kernel: pci 0000:00:17.7: bridge window [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 21:04:20.782847 kernel: pci 0000:00:18.0: PCI bridge to [bus 1b] Jan 13 21:04:20.782897 kernel: pci 0000:00:18.0: bridge window [io 0x7000-0x7fff] Jan 13 21:04:20.782947 kernel: pci 0000:00:18.0: bridge window [mem 0xfd200000-0xfd2fffff] Jan 13 21:04:20.782997 kernel: pci 0000:00:18.0: bridge window [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 21:04:20.783048 kernel: pci 0000:00:18.1: PCI bridge to [bus 1c] Jan 13 21:04:20.783100 kernel: pci 0000:00:18.1: bridge window [io 0xb000-0xbfff] Jan 13 21:04:20.783150 kernel: pci 0000:00:18.1: bridge window [mem 0xfce00000-0xfcefffff] Jan 13 21:04:20.783213 kernel: pci 0000:00:18.1: bridge window [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 21:04:20.783274 kernel: pci 0000:00:18.2: PCI bridge to [bus 1d] Jan 13 21:04:20.783325 kernel: pci 0000:00:18.2: bridge window [mem 0xfca00000-0xfcafffff] Jan 13 21:04:20.783374 kernel: pci 0000:00:18.2: bridge window [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 21:04:20.783424 kernel: pci 0000:00:18.3: PCI bridge to [bus 1e] Jan 13 21:04:20.783474 kernel: pci 0000:00:18.3: bridge window [mem 0xfc600000-0xfc6fffff] Jan 13 21:04:20.783524 kernel: pci 0000:00:18.3: bridge window [mem 0xe6d00000-0xe6dfffff 64bit pref] Jan 13 21:04:20.783577 kernel: pci 0000:00:18.4: PCI bridge to [bus 1f] Jan 13 21:04:20.783628 kernel: pci 0000:00:18.4: bridge window [mem 0xfc200000-0xfc2fffff] Jan 13 21:04:20.783678 kernel: pci 0000:00:18.4: bridge window [mem 0xe6900000-0xe69fffff 
64bit pref] Jan 13 21:04:20.783728 kernel: pci 0000:00:18.5: PCI bridge to [bus 20] Jan 13 21:04:20.783778 kernel: pci 0000:00:18.5: bridge window [mem 0xfbe00000-0xfbefffff] Jan 13 21:04:20.783828 kernel: pci 0000:00:18.5: bridge window [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 21:04:20.783878 kernel: pci 0000:00:18.6: PCI bridge to [bus 21] Jan 13 21:04:20.783929 kernel: pci 0000:00:18.6: bridge window [mem 0xfba00000-0xfbafffff] Jan 13 21:04:20.783979 kernel: pci 0000:00:18.6: bridge window [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 21:04:20.784030 kernel: pci 0000:00:18.7: PCI bridge to [bus 22] Jan 13 21:04:20.784083 kernel: pci 0000:00:18.7: bridge window [mem 0xfb600000-0xfb6fffff] Jan 13 21:04:20.784133 kernel: pci 0000:00:18.7: bridge window [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 21:04:20.785193 kernel: pci_bus 0000:00: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.785254 kernel: pci_bus 0000:00: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.785303 kernel: pci_bus 0000:00: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.785348 kernel: pci_bus 0000:00: resource 7 [io 0x0000-0x0cf7 window] Jan 13 21:04:20.785393 kernel: pci_bus 0000:00: resource 8 [io 0x0d00-0xfeff window] Jan 13 21:04:20.785442 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x3fff] Jan 13 21:04:20.785493 kernel: pci_bus 0000:02: resource 1 [mem 0xfd600000-0xfdffffff] Jan 13 21:04:20.785539 kernel: pci_bus 0000:02: resource 2 [mem 0xe7b00000-0xe7ffffff 64bit pref] Jan 13 21:04:20.785585 kernel: pci_bus 0000:02: resource 4 [mem 0x000a0000-0x000bffff window] Jan 13 21:04:20.785630 kernel: pci_bus 0000:02: resource 5 [mem 0x000cc000-0x000dbfff window] Jan 13 21:04:20.785677 kernel: pci_bus 0000:02: resource 6 [mem 0xc0000000-0xfebfffff window] Jan 13 21:04:20.785723 kernel: pci_bus 0000:02: resource 7 [io 0x0000-0x0cf7 window] Jan 13 21:04:20.785768 kernel: pci_bus 0000:02: resource 8 [io 0x0d00-0xfeff window] Jan 13 21:04:20.785822 
kernel: pci_bus 0000:03: resource 0 [io 0x4000-0x4fff] Jan 13 21:04:20.785869 kernel: pci_bus 0000:03: resource 1 [mem 0xfd500000-0xfd5fffff] Jan 13 21:04:20.785915 kernel: pci_bus 0000:03: resource 2 [mem 0xc0000000-0xc01fffff 64bit pref] Jan 13 21:04:20.785966 kernel: pci_bus 0000:04: resource 0 [io 0x8000-0x8fff] Jan 13 21:04:20.786012 kernel: pci_bus 0000:04: resource 1 [mem 0xfd100000-0xfd1fffff] Jan 13 21:04:20.786058 kernel: pci_bus 0000:04: resource 2 [mem 0xe7800000-0xe78fffff 64bit pref] Jan 13 21:04:20.786107 kernel: pci_bus 0000:05: resource 0 [io 0xc000-0xcfff] Jan 13 21:04:20.786156 kernel: pci_bus 0000:05: resource 1 [mem 0xfcd00000-0xfcdfffff] Jan 13 21:04:20.786532 kernel: pci_bus 0000:05: resource 2 [mem 0xe7400000-0xe74fffff 64bit pref] Jan 13 21:04:20.786589 kernel: pci_bus 0000:06: resource 1 [mem 0xfc900000-0xfc9fffff] Jan 13 21:04:20.786637 kernel: pci_bus 0000:06: resource 2 [mem 0xe7000000-0xe70fffff 64bit pref] Jan 13 21:04:20.786690 kernel: pci_bus 0000:07: resource 1 [mem 0xfc500000-0xfc5fffff] Jan 13 21:04:20.786737 kernel: pci_bus 0000:07: resource 2 [mem 0xe6c00000-0xe6cfffff 64bit pref] Jan 13 21:04:20.786790 kernel: pci_bus 0000:08: resource 1 [mem 0xfc100000-0xfc1fffff] Jan 13 21:04:20.786838 kernel: pci_bus 0000:08: resource 2 [mem 0xe6800000-0xe68fffff 64bit pref] Jan 13 21:04:20.786888 kernel: pci_bus 0000:09: resource 1 [mem 0xfbd00000-0xfbdfffff] Jan 13 21:04:20.786935 kernel: pci_bus 0000:09: resource 2 [mem 0xe6400000-0xe64fffff 64bit pref] Jan 13 21:04:20.786987 kernel: pci_bus 0000:0a: resource 1 [mem 0xfb900000-0xfb9fffff] Jan 13 21:04:20.787043 kernel: pci_bus 0000:0a: resource 2 [mem 0xe6000000-0xe60fffff 64bit pref] Jan 13 21:04:20.787097 kernel: pci_bus 0000:0b: resource 0 [io 0x5000-0x5fff] Jan 13 21:04:20.787145 kernel: pci_bus 0000:0b: resource 1 [mem 0xfd400000-0xfd4fffff] Jan 13 21:04:20.787230 kernel: pci_bus 0000:0b: resource 2 [mem 0xc0200000-0xc03fffff 64bit pref] Jan 13 21:04:20.787287 kernel: pci_bus 
0000:0c: resource 0 [io 0x9000-0x9fff] Jan 13 21:04:20.787335 kernel: pci_bus 0000:0c: resource 1 [mem 0xfd000000-0xfd0fffff] Jan 13 21:04:20.787383 kernel: pci_bus 0000:0c: resource 2 [mem 0xe7700000-0xe77fffff 64bit pref] Jan 13 21:04:20.787434 kernel: pci_bus 0000:0d: resource 0 [io 0xd000-0xdfff] Jan 13 21:04:20.787485 kernel: pci_bus 0000:0d: resource 1 [mem 0xfcc00000-0xfccfffff] Jan 13 21:04:20.787534 kernel: pci_bus 0000:0d: resource 2 [mem 0xe7300000-0xe73fffff 64bit pref] Jan 13 21:04:20.787585 kernel: pci_bus 0000:0e: resource 1 [mem 0xfc800000-0xfc8fffff] Jan 13 21:04:20.787632 kernel: pci_bus 0000:0e: resource 2 [mem 0xe6f00000-0xe6ffffff 64bit pref] Jan 13 21:04:20.787685 kernel: pci_bus 0000:0f: resource 1 [mem 0xfc400000-0xfc4fffff] Jan 13 21:04:20.787732 kernel: pci_bus 0000:0f: resource 2 [mem 0xe6b00000-0xe6bfffff 64bit pref] Jan 13 21:04:20.787784 kernel: pci_bus 0000:10: resource 1 [mem 0xfc000000-0xfc0fffff] Jan 13 21:04:20.787831 kernel: pci_bus 0000:10: resource 2 [mem 0xe6700000-0xe67fffff 64bit pref] Jan 13 21:04:20.787885 kernel: pci_bus 0000:11: resource 1 [mem 0xfbc00000-0xfbcfffff] Jan 13 21:04:20.787931 kernel: pci_bus 0000:11: resource 2 [mem 0xe6300000-0xe63fffff 64bit pref] Jan 13 21:04:20.787981 kernel: pci_bus 0000:12: resource 1 [mem 0xfb800000-0xfb8fffff] Jan 13 21:04:20.788028 kernel: pci_bus 0000:12: resource 2 [mem 0xe5f00000-0xe5ffffff 64bit pref] Jan 13 21:04:20.788080 kernel: pci_bus 0000:13: resource 0 [io 0x6000-0x6fff] Jan 13 21:04:20.788127 kernel: pci_bus 0000:13: resource 1 [mem 0xfd300000-0xfd3fffff] Jan 13 21:04:20.788174 kernel: pci_bus 0000:13: resource 2 [mem 0xe7a00000-0xe7afffff 64bit pref] Jan 13 21:04:20.788279 kernel: pci_bus 0000:14: resource 0 [io 0xa000-0xafff] Jan 13 21:04:20.788327 kernel: pci_bus 0000:14: resource 1 [mem 0xfcf00000-0xfcffffff] Jan 13 21:04:20.788374 kernel: pci_bus 0000:14: resource 2 [mem 0xe7600000-0xe76fffff 64bit pref] Jan 13 21:04:20.788424 kernel: pci_bus 0000:15: resource 0 
[io 0xe000-0xefff] Jan 13 21:04:20.788474 kernel: pci_bus 0000:15: resource 1 [mem 0xfcb00000-0xfcbfffff] Jan 13 21:04:20.788520 kernel: pci_bus 0000:15: resource 2 [mem 0xe7200000-0xe72fffff 64bit pref] Jan 13 21:04:20.788570 kernel: pci_bus 0000:16: resource 1 [mem 0xfc700000-0xfc7fffff] Jan 13 21:04:20.788618 kernel: pci_bus 0000:16: resource 2 [mem 0xe6e00000-0xe6efffff 64bit pref] Jan 13 21:04:20.788667 kernel: pci_bus 0000:17: resource 1 [mem 0xfc300000-0xfc3fffff] Jan 13 21:04:20.788714 kernel: pci_bus 0000:17: resource 2 [mem 0xe6a00000-0xe6afffff 64bit pref] Jan 13 21:04:20.788770 kernel: pci_bus 0000:18: resource 1 [mem 0xfbf00000-0xfbffffff] Jan 13 21:04:20.788817 kernel: pci_bus 0000:18: resource 2 [mem 0xe6600000-0xe66fffff 64bit pref] Jan 13 21:04:20.788867 kernel: pci_bus 0000:19: resource 1 [mem 0xfbb00000-0xfbbfffff] Jan 13 21:04:20.788914 kernel: pci_bus 0000:19: resource 2 [mem 0xe6200000-0xe62fffff 64bit pref] Jan 13 21:04:20.788965 kernel: pci_bus 0000:1a: resource 1 [mem 0xfb700000-0xfb7fffff] Jan 13 21:04:20.789012 kernel: pci_bus 0000:1a: resource 2 [mem 0xe5e00000-0xe5efffff 64bit pref] Jan 13 21:04:20.789065 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Jan 13 21:04:20.789112 kernel: pci_bus 0000:1b: resource 1 [mem 0xfd200000-0xfd2fffff] Jan 13 21:04:20.789159 kernel: pci_bus 0000:1b: resource 2 [mem 0xe7900000-0xe79fffff 64bit pref] Jan 13 21:04:20.789217 kernel: pci_bus 0000:1c: resource 0 [io 0xb000-0xbfff] Jan 13 21:04:20.789266 kernel: pci_bus 0000:1c: resource 1 [mem 0xfce00000-0xfcefffff] Jan 13 21:04:20.789312 kernel: pci_bus 0000:1c: resource 2 [mem 0xe7500000-0xe75fffff 64bit pref] Jan 13 21:04:20.789362 kernel: pci_bus 0000:1d: resource 1 [mem 0xfca00000-0xfcafffff] Jan 13 21:04:20.789412 kernel: pci_bus 0000:1d: resource 2 [mem 0xe7100000-0xe71fffff 64bit pref] Jan 13 21:04:20.789464 kernel: pci_bus 0000:1e: resource 1 [mem 0xfc600000-0xfc6fffff] Jan 13 21:04:20.789511 kernel: pci_bus 0000:1e: resource 2 [mem 
0xe6d00000-0xe6dfffff 64bit pref] Jan 13 21:04:20.789561 kernel: pci_bus 0000:1f: resource 1 [mem 0xfc200000-0xfc2fffff] Jan 13 21:04:20.789609 kernel: pci_bus 0000:1f: resource 2 [mem 0xe6900000-0xe69fffff 64bit pref] Jan 13 21:04:20.789661 kernel: pci_bus 0000:20: resource 1 [mem 0xfbe00000-0xfbefffff] Jan 13 21:04:20.789711 kernel: pci_bus 0000:20: resource 2 [mem 0xe6500000-0xe65fffff 64bit pref] Jan 13 21:04:20.789761 kernel: pci_bus 0000:21: resource 1 [mem 0xfba00000-0xfbafffff] Jan 13 21:04:20.789811 kernel: pci_bus 0000:21: resource 2 [mem 0xe6100000-0xe61fffff 64bit pref] Jan 13 21:04:20.789861 kernel: pci_bus 0000:22: resource 1 [mem 0xfb600000-0xfb6fffff] Jan 13 21:04:20.789909 kernel: pci_bus 0000:22: resource 2 [mem 0xe5d00000-0xe5dfffff 64bit pref] Jan 13 21:04:20.789965 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Jan 13 21:04:20.789977 kernel: PCI: CLS 32 bytes, default 64 Jan 13 21:04:20.789984 kernel: RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Jan 13 21:04:20.789991 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x311fd3cd494, max_idle_ns: 440795223879 ns Jan 13 21:04:20.789997 kernel: clocksource: Switched to clocksource tsc Jan 13 21:04:20.790003 kernel: Initialise system trusted keyrings Jan 13 21:04:20.790010 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jan 13 21:04:20.790016 kernel: Key type asymmetric registered Jan 13 21:04:20.790022 kernel: Asymmetric key parser 'x509' registered Jan 13 21:04:20.790028 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) Jan 13 21:04:20.790036 kernel: io scheduler mq-deadline registered Jan 13 21:04:20.790043 kernel: io scheduler kyber registered Jan 13 21:04:20.790049 kernel: io scheduler bfq registered Jan 13 21:04:20.790102 kernel: pcieport 0000:00:15.0: PME: Signaling with IRQ 24 Jan 13 21:04:20.790155 kernel: pcieport 0000:00:15.0: pciehp: Slot #160 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- 
HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790556 kernel: pcieport 0000:00:15.1: PME: Signaling with IRQ 25 Jan 13 21:04:20.790614 kernel: pcieport 0000:00:15.1: pciehp: Slot #161 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790668 kernel: pcieport 0000:00:15.2: PME: Signaling with IRQ 26 Jan 13 21:04:20.790724 kernel: pcieport 0000:00:15.2: pciehp: Slot #162 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790776 kernel: pcieport 0000:00:15.3: PME: Signaling with IRQ 27 Jan 13 21:04:20.790828 kernel: pcieport 0000:00:15.3: pciehp: Slot #163 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790879 kernel: pcieport 0000:00:15.4: PME: Signaling with IRQ 28 Jan 13 21:04:20.790930 kernel: pcieport 0000:00:15.4: pciehp: Slot #164 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.790981 kernel: pcieport 0000:00:15.5: PME: Signaling with IRQ 29 Jan 13 21:04:20.791036 kernel: pcieport 0000:00:15.5: pciehp: Slot #165 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791087 kernel: pcieport 0000:00:15.6: PME: Signaling with IRQ 30 Jan 13 21:04:20.791139 kernel: pcieport 0000:00:15.6: pciehp: Slot #166 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791424 kernel: pcieport 0000:00:15.7: PME: Signaling with IRQ 31 Jan 13 21:04:20.791480 kernel: pcieport 0000:00:15.7: pciehp: Slot #167 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791536 kernel: pcieport 0000:00:16.0: PME: Signaling with IRQ 32 Jan 13 21:04:20.791588 kernel: pcieport 0000:00:16.0: pciehp: Slot #192 AttnBtn+ 
PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791640 kernel: pcieport 0000:00:16.1: PME: Signaling with IRQ 33 Jan 13 21:04:20.791692 kernel: pcieport 0000:00:16.1: pciehp: Slot #193 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791743 kernel: pcieport 0000:00:16.2: PME: Signaling with IRQ 34 Jan 13 21:04:20.791795 kernel: pcieport 0000:00:16.2: pciehp: Slot #194 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791846 kernel: pcieport 0000:00:16.3: PME: Signaling with IRQ 35 Jan 13 21:04:20.791899 kernel: pcieport 0000:00:16.3: pciehp: Slot #195 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.791950 kernel: pcieport 0000:00:16.4: PME: Signaling with IRQ 36 Jan 13 21:04:20.792002 kernel: pcieport 0000:00:16.4: pciehp: Slot #196 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.792053 kernel: pcieport 0000:00:16.5: PME: Signaling with IRQ 37 Jan 13 21:04:20.792104 kernel: pcieport 0000:00:16.5: pciehp: Slot #197 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.792159 kernel: pcieport 0000:00:16.6: PME: Signaling with IRQ 38 Jan 13 21:04:20.792304 kernel: pcieport 0000:00:16.6: pciehp: Slot #198 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793280 kernel: pcieport 0000:00:16.7: PME: Signaling with IRQ 39 Jan 13 21:04:20.793341 kernel: pcieport 0000:00:16.7: pciehp: Slot #199 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793395 kernel: pcieport 0000:00:17.0: PME: Signaling with IRQ 40 Jan 13 21:04:20.793448 kernel: pcieport 0000:00:17.0: 
pciehp: Slot #224 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793503 kernel: pcieport 0000:00:17.1: PME: Signaling with IRQ 41 Jan 13 21:04:20.793554 kernel: pcieport 0000:00:17.1: pciehp: Slot #225 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793605 kernel: pcieport 0000:00:17.2: PME: Signaling with IRQ 42 Jan 13 21:04:20.793656 kernel: pcieport 0000:00:17.2: pciehp: Slot #226 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793707 kernel: pcieport 0000:00:17.3: PME: Signaling with IRQ 43 Jan 13 21:04:20.793758 kernel: pcieport 0000:00:17.3: pciehp: Slot #227 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793808 kernel: pcieport 0000:00:17.4: PME: Signaling with IRQ 44 Jan 13 21:04:20.793862 kernel: pcieport 0000:00:17.4: pciehp: Slot #228 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.793913 kernel: pcieport 0000:00:17.5: PME: Signaling with IRQ 45 Jan 13 21:04:20.793964 kernel: pcieport 0000:00:17.5: pciehp: Slot #229 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.794014 kernel: pcieport 0000:00:17.6: PME: Signaling with IRQ 46 Jan 13 21:04:20.794066 kernel: pcieport 0000:00:17.6: pciehp: Slot #230 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.794120 kernel: pcieport 0000:00:17.7: PME: Signaling with IRQ 47 Jan 13 21:04:20.794172 kernel: pcieport 0000:00:17.7: pciehp: Slot #231 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795270 kernel: pcieport 0000:00:18.0: PME: Signaling with IRQ 48 Jan 13 21:04:20.795328 
kernel: pcieport 0000:00:18.0: pciehp: Slot #256 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795382 kernel: pcieport 0000:00:18.1: PME: Signaling with IRQ 49 Jan 13 21:04:20.795434 kernel: pcieport 0000:00:18.1: pciehp: Slot #257 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795489 kernel: pcieport 0000:00:18.2: PME: Signaling with IRQ 50 Jan 13 21:04:20.795539 kernel: pcieport 0000:00:18.2: pciehp: Slot #258 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795590 kernel: pcieport 0000:00:18.3: PME: Signaling with IRQ 51 Jan 13 21:04:20.795641 kernel: pcieport 0000:00:18.3: pciehp: Slot #259 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795693 kernel: pcieport 0000:00:18.4: PME: Signaling with IRQ 52 Jan 13 21:04:20.795747 kernel: pcieport 0000:00:18.4: pciehp: Slot #260 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795798 kernel: pcieport 0000:00:18.5: PME: Signaling with IRQ 53 Jan 13 21:04:20.795849 kernel: pcieport 0000:00:18.5: pciehp: Slot #261 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.795899 kernel: pcieport 0000:00:18.6: PME: Signaling with IRQ 54 Jan 13 21:04:20.795950 kernel: pcieport 0000:00:18.6: pciehp: Slot #262 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.796000 kernel: pcieport 0000:00:18.7: PME: Signaling with IRQ 55 Jan 13 21:04:20.796054 kernel: pcieport 0000:00:18.7: pciehp: Slot #263 AttnBtn+ PwrCtrl+ MRL- AttnInd- PwrInd- HotPlug+ Surprise- Interlock- NoCompl+ IbPresDis- LLActRep+ Jan 13 21:04:20.796064 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Jan 13 21:04:20.796070 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 13 21:04:20.796077 kernel: 00:05: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jan 13 21:04:20.796083 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBC,PNP0f13:MOUS] at 0x60,0x64 irq 1,12 Jan 13 21:04:20.796090 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jan 13 21:04:20.796096 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jan 13 21:04:20.796149 kernel: rtc_cmos 00:01: registered as rtc0 Jan 13 21:04:20.796534 kernel: rtc_cmos 00:01: setting system clock to 2025-01-13T21:04:20 UTC (1736802260) Jan 13 21:04:20.796587 kernel: rtc_cmos 00:01: alarms up to one month, y3k, 114 bytes nvram Jan 13 21:04:20.796597 kernel: intel_pstate: CPU model not supported Jan 13 21:04:20.796604 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jan 13 21:04:20.796610 kernel: NET: Registered PF_INET6 protocol family Jan 13 21:04:20.796617 kernel: Segment Routing with IPv6 Jan 13 21:04:20.796623 kernel: In-situ OAM (IOAM) with IPv6 Jan 13 21:04:20.796632 kernel: NET: Registered PF_PACKET protocol family Jan 13 21:04:20.796638 kernel: Key type dns_resolver registered Jan 13 21:04:20.796645 kernel: IPI shorthand broadcast: enabled Jan 13 21:04:20.796651 kernel: sched_clock: Marking stable (885446644, 225799539)->(1171383470, -60137287) Jan 13 21:04:20.796657 kernel: registered taskstats version 1 Jan 13 21:04:20.796663 kernel: Loading compiled-in X.509 certificates Jan 13 21:04:20.796669 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.71-flatcar: 98739e9049f62881f4df7ffd1e39335f7f55b344' Jan 13 21:04:20.796676 kernel: Key type .fscrypt registered Jan 13 21:04:20.796683 kernel: Key type fscrypt-provisioning registered Jan 13 21:04:20.796690 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jan 13 21:04:20.796697 kernel: ima: Allocated hash algorithm: sha1 Jan 13 21:04:20.796703 kernel: ima: No architecture policies found Jan 13 21:04:20.796709 kernel: clk: Disabling unused clocks Jan 13 21:04:20.796715 kernel: Freeing unused kernel image (initmem) memory: 42976K Jan 13 21:04:20.796723 kernel: Write protecting the kernel read-only data: 36864k Jan 13 21:04:20.796729 kernel: Freeing unused kernel image (rodata/data gap) memory: 1840K Jan 13 21:04:20.796736 kernel: Run /init as init process Jan 13 21:04:20.796742 kernel: with arguments: Jan 13 21:04:20.796750 kernel: /init Jan 13 21:04:20.796756 kernel: with environment: Jan 13 21:04:20.796762 kernel: HOME=/ Jan 13 21:04:20.796768 kernel: TERM=linux Jan 13 21:04:20.796774 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jan 13 21:04:20.796781 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified) Jan 13 21:04:20.796789 systemd[1]: Detected virtualization vmware. Jan 13 21:04:20.796796 systemd[1]: Detected architecture x86-64. Jan 13 21:04:20.796803 systemd[1]: Running in initrd. Jan 13 21:04:20.796810 systemd[1]: No hostname configured, using default hostname. Jan 13 21:04:20.796817 systemd[1]: Hostname set to . Jan 13 21:04:20.796824 systemd[1]: Initializing machine ID from random generator. Jan 13 21:04:20.796830 systemd[1]: Queued start job for default target initrd.target. Jan 13 21:04:20.796836 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 13 21:04:20.796843 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jan 13 21:04:20.796850 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 13 21:04:20.796858 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 13 21:04:20.796865 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 13 21:04:20.796871 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 13 21:04:20.796879 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jan 13 21:04:20.796886 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jan 13 21:04:20.796892 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 13 21:04:20.796899 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 13 21:04:20.796907 systemd[1]: Reached target paths.target - Path Units. Jan 13 21:04:20.796913 systemd[1]: Reached target slices.target - Slice Units. Jan 13 21:04:20.796920 systemd[1]: Reached target swap.target - Swaps. Jan 13 21:04:20.796926 systemd[1]: Reached target timers.target - Timer Units. Jan 13 21:04:20.796933 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 13 21:04:20.796939 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 13 21:04:20.796945 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 13 21:04:20.796952 systemd[1]: Listening on systemd-journald.socket - Journal Socket. Jan 13 21:04:20.796958 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 13 21:04:20.796966 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 13 21:04:20.796973 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. 
Jan 13 21:04:20.796979 systemd[1]: Reached target sockets.target - Socket Units. Jan 13 21:04:20.796986 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 13 21:04:20.796992 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 13 21:04:20.796999 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 13 21:04:20.797005 systemd[1]: Starting systemd-fsck-usr.service... Jan 13 21:04:20.797012 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 13 21:04:20.797019 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 13 21:04:20.797026 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:20.797230 systemd-journald[217]: Collecting audit messages is disabled. Jan 13 21:04:20.797250 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 13 21:04:20.797260 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 13 21:04:20.797266 systemd[1]: Finished systemd-fsck-usr.service. Jan 13 21:04:20.797273 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 13 21:04:20.797280 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 13 21:04:20.797288 kernel: Bridge firewalling registered Jan 13 21:04:20.797295 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 13 21:04:20.797302 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 13 21:04:20.797309 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:20.797315 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Jan 13 21:04:20.797322 systemd-journald[217]: Journal started Jan 13 21:04:20.797337 systemd-journald[217]: Runtime Journal (/run/log/journal/1d073833aaed44bc9a4bb9c5dbd400a3) is 4.8M, max 38.7M, 33.8M free. Jan 13 21:04:20.748649 systemd-modules-load[218]: Inserted module 'overlay' Jan 13 21:04:20.799657 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:20.772263 systemd-modules-load[218]: Inserted module 'br_netfilter' Jan 13 21:04:20.802319 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 13 21:04:20.802341 systemd[1]: Started systemd-journald.service - Journal Service. Jan 13 21:04:20.802640 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 13 21:04:20.808329 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 13 21:04:20.808541 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 13 21:04:20.818415 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:20.820280 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 13 21:04:20.822376 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 13 21:04:20.824330 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... 
Jan 13 21:04:20.829090 dracut-cmdline[248]: dracut-dracut-053 Jan 13 21:04:20.832988 dracut-cmdline[248]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=vmware flatcar.autologin verity.usrhash=1175b5bd4028ce8485b23b7d346f787308cbfa43cca7b1fefd4254406dce7d07 Jan 13 21:04:20.847408 systemd-resolved[250]: Positive Trust Anchors: Jan 13 21:04:20.847418 systemd-resolved[250]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 13 21:04:20.847446 systemd-resolved[250]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 13 21:04:20.849187 systemd-resolved[250]: Defaulting to hostname 'linux'. Jan 13 21:04:20.849859 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 13 21:04:20.850392 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 13 21:04:20.882199 kernel: SCSI subsystem initialized Jan 13 21:04:20.888191 kernel: Loading iSCSI transport class v2.0-870. 
Jan 13 21:04:20.895196 kernel: iscsi: registered transport (tcp) Jan 13 21:04:20.908454 kernel: iscsi: registered transport (qla4xxx) Jan 13 21:04:20.908519 kernel: QLogic iSCSI HBA Driver Jan 13 21:04:20.930718 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 13 21:04:20.935307 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 13 21:04:20.950283 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 13 21:04:20.950321 kernel: device-mapper: uevent: version 1.0.3 Jan 13 21:04:20.951341 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com Jan 13 21:04:20.983207 kernel: raid6: avx2x4 gen() 52252 MB/s Jan 13 21:04:20.999239 kernel: raid6: avx2x2 gen() 52497 MB/s Jan 13 21:04:21.016406 kernel: raid6: avx2x1 gen() 44594 MB/s Jan 13 21:04:21.016448 kernel: raid6: using algorithm avx2x2 gen() 52497 MB/s Jan 13 21:04:21.034396 kernel: raid6: .... xor() 31350 MB/s, rmw enabled Jan 13 21:04:21.034416 kernel: raid6: using avx2x2 recovery algorithm Jan 13 21:04:21.048200 kernel: xor: automatically using best checksumming function avx Jan 13 21:04:21.146216 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 13 21:04:21.151467 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 13 21:04:21.157283 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 13 21:04:21.164977 systemd-udevd[433]: Using default interface naming scheme 'v255'. Jan 13 21:04:21.167551 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 13 21:04:21.173301 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 13 21:04:21.180390 dracut-pre-trigger[438]: rd.md=0: removing MD RAID activation Jan 13 21:04:21.197683 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
Jan 13 21:04:21.201265 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 13 21:04:21.274810 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 13 21:04:21.279374 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 13 21:04:21.286358 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 13 21:04:21.287438 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 13 21:04:21.287762 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 21:04:21.288131 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 13 21:04:21.292297 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 13 21:04:21.301411 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 13 21:04:21.336197 kernel: VMware PVSCSI driver - version 1.0.7.0-k Jan 13 21:04:21.339236 kernel: vmw_pvscsi: using 64bit dma Jan 13 21:04:21.341192 kernel: vmw_pvscsi: max_id: 16 Jan 13 21:04:21.341209 kernel: vmw_pvscsi: setting ring_pages to 8 Jan 13 21:04:21.349215 kernel: vmw_pvscsi: enabling reqCallThreshold Jan 13 21:04:21.349246 kernel: vmw_pvscsi: driver-based request coalescing enabled Jan 13 21:04:21.349255 kernel: vmw_pvscsi: using MSI-X Jan 13 21:04:21.349262 kernel: scsi host0: VMware PVSCSI storage adapter rev 2, req/cmp/msg rings: 8/8/1 pages, cmd_per_lun=254 Jan 13 21:04:21.358908 kernel: vmw_pvscsi 0000:03:00.0: VMware PVSCSI rev 2 host #0 Jan 13 21:04:21.360994 kernel: scsi 0:0:0:0: Direct-Access VMware Virtual disk 2.0 PQ: 0 ANSI: 6 Jan 13 21:04:21.361074 kernel: VMware vmxnet3 virtual NIC driver - version 1.7.0.0-k-NAPI Jan 13 21:04:21.363241 kernel: vmxnet3 0000:0b:00.0: # of Tx queues : 2, # of Rx queues : 2 Jan 13 21:04:21.369159 kernel: libata version 3.00 loaded. 
Jan 13 21:04:21.369170 kernel: ata_piix 0000:00:07.1: version 2.13 Jan 13 21:04:21.375762 kernel: vmxnet3 0000:0b:00.0 eth0: NIC Link is Up 10000 Mbps Jan 13 21:04:21.375837 kernel: scsi host1: ata_piix Jan 13 21:04:21.375902 kernel: scsi host2: ata_piix Jan 13 21:04:21.375966 kernel: ata1: PATA max UDMA/33 cmd 0x1f0 ctl 0x3f6 bmdma 0x1060 irq 14 Jan 13 21:04:21.375974 kernel: ata2: PATA max UDMA/33 cmd 0x170 ctl 0x376 bmdma 0x1068 irq 15 Jan 13 21:04:21.377292 kernel: cryptd: max_cpu_qlen set to 1000 Jan 13 21:04:21.381191 kernel: vmxnet3 0000:0b:00.0 ens192: renamed from eth0 Jan 13 21:04:21.385176 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 13 21:04:21.385217 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:21.385628 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:21.385858 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 13 21:04:21.385884 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:21.386100 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:21.394314 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 13 21:04:21.407159 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 13 21:04:21.411398 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 13 21:04:21.423762 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 13 21:04:21.546214 kernel: ata2.00: ATAPI: VMware Virtual IDE CDROM Drive, 00000001, max UDMA/33 Jan 13 21:04:21.552199 kernel: scsi 2:0:0:0: CD-ROM NECVMWar VMware IDE CDR10 1.00 PQ: 0 ANSI: 5 Jan 13 21:04:21.563399 kernel: AVX2 version of gcm_enc/dec engaged. 
Jan 13 21:04:21.563437 kernel: AES CTR mode by8 optimization enabled Jan 13 21:04:21.573418 kernel: sd 0:0:0:0: [sda] 17805312 512-byte logical blocks: (9.12 GB/8.49 GiB) Jan 13 21:04:21.592080 kernel: sd 0:0:0:0: [sda] Write Protect is off Jan 13 21:04:21.592384 kernel: sd 0:0:0:0: [sda] Mode Sense: 31 00 00 00 Jan 13 21:04:21.592722 kernel: sd 0:0:0:0: [sda] Cache data unavailable Jan 13 21:04:21.592811 kernel: sd 0:0:0:0: [sda] Assuming drive cache: write through Jan 13 21:04:21.592917 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 1x/1x writer dvd-ram cd/rw xa/form2 cdda tray Jan 13 21:04:21.593024 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 13 21:04:21.593035 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Jan 13 21:04:21.593115 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:21.593124 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Jan 13 21:04:21.628194 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (496) Jan 13 21:04:21.628233 kernel: BTRFS: device fsid 5e7921ba-229a-48a0-bc77-9b30aaa34aeb devid 1 transid 36 /dev/sda3 scanned by (udev-worker) (499) Jan 13 21:04:21.628271 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - Virtual_disk ROOT. Jan 13 21:04:21.633865 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - Virtual_disk EFI-SYSTEM. Jan 13 21:04:21.636994 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM. Jan 13 21:04:21.639643 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - Virtual_disk USR-A. Jan 13 21:04:21.639794 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - Virtual_disk USR-A. Jan 13 21:04:21.647278 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
Jan 13 21:04:21.693213 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:21.698209 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:22.700199 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 13 21:04:22.700641 disk-uuid[595]: The operation has completed successfully. Jan 13 21:04:22.734875 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 13 21:04:22.735206 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 13 21:04:22.740263 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jan 13 21:04:22.742373 sh[611]: Success Jan 13 21:04:22.750200 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Jan 13 21:04:22.811483 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jan 13 21:04:22.812545 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jan 13 21:04:22.812878 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jan 13 21:04:22.827906 kernel: BTRFS info (device dm-0): first mount of filesystem 5e7921ba-229a-48a0-bc77-9b30aaa34aeb Jan 13 21:04:22.827942 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:22.827951 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead Jan 13 21:04:22.828994 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 13 21:04:22.830482 kernel: BTRFS info (device dm-0): using free space tree Jan 13 21:04:22.837212 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 13 21:04:22.839490 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jan 13 21:04:22.847291 systemd[1]: Starting afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments... Jan 13 21:04:22.848559 systemd[1]: Starting ignition-setup.service - Ignition (setup)... 
Jan 13 21:04:22.870206 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:22.872199 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:22.872219 kernel: BTRFS info (device sda6): using free space tree Jan 13 21:04:22.889398 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 21:04:22.893803 systemd[1]: mnt-oem.mount: Deactivated successfully. Jan 13 21:04:22.895250 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:22.897427 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 13 21:04:22.901682 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 13 21:04:22.913030 systemd[1]: Finished afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments. Jan 13 21:04:22.919542 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 13 21:04:22.983142 ignition[671]: Ignition 2.20.0 Jan 13 21:04:22.983283 ignition[671]: Stage: fetch-offline Jan 13 21:04:22.983165 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. 
Jan 13 21:04:22.983302 ignition[671]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:22.983307 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:22.983363 ignition[671]: parsed url from cmdline: "" Jan 13 21:04:22.983365 ignition[671]: no config URL provided Jan 13 21:04:22.983368 ignition[671]: reading system config file "/usr/lib/ignition/user.ign" Jan 13 21:04:22.983373 ignition[671]: no config at "/usr/lib/ignition/user.ign" Jan 13 21:04:22.983717 ignition[671]: config successfully fetched Jan 13 21:04:22.983734 ignition[671]: parsing config with SHA512: 9afce9029d2d8820adcac0b0a5955968a6e94aa2ce22de805e79d3bbcf06f1dced2f1348b154596a909187013da21f53eaf1a837169882392d24d6252a22fc5b Jan 13 21:04:22.987683 unknown[671]: fetched base config from "system" Jan 13 21:04:22.987950 ignition[671]: fetch-offline: fetch-offline passed Jan 13 21:04:22.987688 unknown[671]: fetched user config from "vmware" Jan 13 21:04:22.987995 ignition[671]: Ignition finished successfully Jan 13 21:04:22.992848 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 13 21:04:22.993100 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 13 21:04:23.004644 systemd-networkd[804]: lo: Link UP Jan 13 21:04:23.004650 systemd-networkd[804]: lo: Gained carrier Jan 13 21:04:23.005325 systemd-networkd[804]: Enumeration completed Jan 13 21:04:23.005373 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 13 21:04:23.005515 systemd[1]: Reached target network.target - Network. Jan 13 21:04:23.005763 systemd-networkd[804]: ens192: Configuring with /etc/systemd/network/10-dracut-cmdline-99.network. Jan 13 21:04:23.006378 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). 
Jan 13 21:04:23.009722 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated Jan 13 21:04:23.009842 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps Jan 13 21:04:23.010032 systemd-networkd[804]: ens192: Link UP Jan 13 21:04:23.010037 systemd-networkd[804]: ens192: Gained carrier Jan 13 21:04:23.010506 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 13 21:04:23.018881 ignition[807]: Ignition 2.20.0 Jan 13 21:04:23.018889 ignition[807]: Stage: kargs Jan 13 21:04:23.018983 ignition[807]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.018989 ignition[807]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.019521 ignition[807]: kargs: kargs passed Jan 13 21:04:23.019549 ignition[807]: Ignition finished successfully Jan 13 21:04:23.020616 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 13 21:04:23.025298 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 13 21:04:23.032432 ignition[814]: Ignition 2.20.0 Jan 13 21:04:23.032439 ignition[814]: Stage: disks Jan 13 21:04:23.032540 ignition[814]: no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.032546 ignition[814]: no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.033126 ignition[814]: disks: disks passed Jan 13 21:04:23.033155 ignition[814]: Ignition finished successfully Jan 13 21:04:23.033778 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 13 21:04:23.034293 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 13 21:04:23.034528 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 13 21:04:23.034761 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 13 21:04:23.034940 systemd[1]: Reached target sysinit.target - System Initialization. Jan 13 21:04:23.035153 systemd[1]: Reached target basic.target - Basic System. 
Jan 13 21:04:23.042305 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 13 21:04:23.052375 systemd-fsck[822]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks Jan 13 21:04:23.053479 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 13 21:04:23.057261 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 13 21:04:23.111375 kernel: EXT4-fs (sda9): mounted filesystem 84bcd1b2-5573-4e91-8fd5-f97782397085 r/w with ordered data mode. Quota mode: none. Jan 13 21:04:23.111711 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 13 21:04:23.112103 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 13 21:04:23.124288 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 21:04:23.127008 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 13 21:04:23.127327 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Jan 13 21:04:23.127353 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 13 21:04:23.127366 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 21:04:23.130142 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 13 21:04:23.131507 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 13 21:04:23.133195 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (830) Jan 13 21:04:23.137337 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:23.137362 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:23.137371 kernel: BTRFS info (device sda6): using free space tree Jan 13 21:04:23.143194 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 21:04:23.144129 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 21:04:23.161995 initrd-setup-root[854]: cut: /sysroot/etc/passwd: No such file or directory Jan 13 21:04:23.182009 initrd-setup-root[861]: cut: /sysroot/etc/group: No such file or directory Jan 13 21:04:23.190878 initrd-setup-root[868]: cut: /sysroot/etc/shadow: No such file or directory Jan 13 21:04:23.192801 initrd-setup-root[875]: cut: /sysroot/etc/gshadow: No such file or directory Jan 13 21:04:23.248860 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 13 21:04:23.254264 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 13 21:04:23.256706 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 13 21:04:23.261199 kernel: BTRFS info (device sda6): last unmount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:23.272317 ignition[942]: INFO : Ignition 2.20.0 Jan 13 21:04:23.272317 ignition[942]: INFO : Stage: mount Jan 13 21:04:23.272317 ignition[942]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.272317 ignition[942]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.273383 ignition[942]: INFO : mount: mount passed Jan 13 21:04:23.273383 ignition[942]: INFO : Ignition finished successfully Jan 13 21:04:23.273636 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 13 21:04:23.279308 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Jan 13 21:04:23.279536 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 13 21:04:23.826420 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 13 21:04:23.831316 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 13 21:04:23.839224 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (955) Jan 13 21:04:23.842382 kernel: BTRFS info (device sda6): first mount of filesystem 1066b41d-395d-4ccb-b5ae-be36ea0fc11e Jan 13 21:04:23.842400 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jan 13 21:04:23.842409 kernel: BTRFS info (device sda6): using free space tree Jan 13 21:04:23.848192 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 13 21:04:23.848423 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 13 21:04:23.862278 ignition[972]: INFO : Ignition 2.20.0 Jan 13 21:04:23.862278 ignition[972]: INFO : Stage: files Jan 13 21:04:23.863216 ignition[972]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 13 21:04:23.863216 ignition[972]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware" Jan 13 21:04:23.863216 ignition[972]: DEBUG : files: compiled without relabeling support, skipping Jan 13 21:04:23.863653 ignition[972]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 13 21:04:23.863653 ignition[972]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 13 21:04:23.865616 ignition[972]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 13 21:04:23.865755 ignition[972]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 13 21:04:23.865887 ignition[972]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 13 21:04:23.865827 unknown[972]: wrote ssh authorized keys file for user: core Jan 13 21:04:23.867634 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing 
file "/sysroot/etc/flatcar-cgroupv1" Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 21:04:23.867793 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Jan 13 21:04:23.906529 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Jan 13 21:04:24.037968 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Jan 13 21:04:24.037968 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 13 21:04:24.038418 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/home/core/nfs-pvc.yaml" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 21:04:24.039524 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.29.2-x86-64.raw: attempt #1 Jan 13 21:04:24.543332 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK Jan 13 21:04:24.839763 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.29.2-x86-64.raw" Jan 13 21:04:24.840039 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 21:04:24.840039 ignition[972]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/etc/systemd/network/00-vmware.network" Jan 13 21:04:24.840039 ignition[972]: INFO : files: op(d): [started] processing unit "containerd.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): op(e): [started] 
writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(d): [finished] processing unit "containerd.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): [started] processing unit "prepare-helm.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): op(10): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): op(10): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(f): [finished] processing unit "prepare-helm.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): [started] processing unit "coreos-metadata.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): op(12): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): op(12): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(11): [finished] processing unit "coreos-metadata.service" Jan 13 21:04:24.840536 ignition[972]: INFO : files: op(13): [started] setting preset to disabled for "coreos-metadata.service" Jan 13 21:04:24.878315 ignition[972]: INFO : files: op(13): op(14): [started] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(13): op(14): [finished] removing enablement symlink(s) for "coreos-metadata.service" Jan 13 21:04:24.881125 ignition[972]: 
INFO : files: op(13): [finished] setting preset to disabled for "coreos-metadata.service" Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(15): [started] setting preset to enabled for "prepare-helm.service" Jan 13 21:04:24.881125 ignition[972]: INFO : files: op(15): [finished] setting preset to enabled for "prepare-helm.service" Jan 13 21:04:24.881125 ignition[972]: INFO : files: createResultFile: createFiles: op(16): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 13 21:04:24.881125 ignition[972]: INFO : files: createResultFile: createFiles: op(16): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 13 21:04:24.881125 ignition[972]: INFO : files: files passed Jan 13 21:04:24.881125 ignition[972]: INFO : Ignition finished successfully Jan 13 21:04:24.882765 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 13 21:04:24.887299 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 13 21:04:24.888885 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 13 21:04:24.889479 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 13 21:04:24.889667 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 13 21:04:24.894891 initrd-setup-root-after-ignition[1002]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:04:24.894891 initrd-setup-root-after-ignition[1002]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:04:24.895805 initrd-setup-root-after-ignition[1006]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 13 21:04:24.896487 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 13 21:04:24.896825 systemd[1]: Reached target ignition-complete.target - Ignition Complete. 
Jan 13 21:04:24.900271 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 13 21:04:24.916407 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 13 21:04:24.916467 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 13 21:04:24.916749 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 13 21:04:24.916870 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 13 21:04:24.917064 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 13 21:04:24.917514 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 13 21:04:24.926760 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 21:04:24.930308 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 13 21:04:24.935918 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 13 21:04:24.936270 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 13 21:04:24.936535 systemd[1]: Stopped target timers.target - Timer Units. Jan 13 21:04:24.936836 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 13 21:04:24.936935 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 13 21:04:24.937593 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 13 21:04:24.937862 systemd[1]: Stopped target basic.target - Basic System. Jan 13 21:04:24.938081 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 13 21:04:24.938372 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 13 21:04:24.938616 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 13 21:04:24.938889 systemd[1]: Stopped target remote-fs.target - Remote File Systems. 
Jan 13 21:04:24.939167 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jan 13 21:04:24.939438 systemd[1]: Stopped target sysinit.target - System Initialization.
Jan 13 21:04:24.939712 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jan 13 21:04:24.939934 systemd[1]: Stopped target swap.target - Swaps.
Jan 13 21:04:24.940138 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 13 21:04:24.940333 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jan 13 21:04:24.940683 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jan 13 21:04:24.940960 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 21:04:24.941197 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jan 13 21:04:24.941359 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 21:04:24.941624 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 13 21:04:24.941688 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jan 13 21:04:24.942096 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jan 13 21:04:24.942163 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jan 13 21:04:24.942580 systemd[1]: Stopped target paths.target - Path Units.
Jan 13 21:04:24.942795 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 13 21:04:24.942967 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 21:04:24.943282 systemd[1]: Stopped target slices.target - Slice Units.
Jan 13 21:04:24.943516 systemd[1]: Stopped target sockets.target - Socket Units.
Jan 13 21:04:24.943748 systemd[1]: iscsid.socket: Deactivated successfully.
Jan 13 21:04:24.943801 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jan 13 21:04:24.944142 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jan 13 21:04:24.944205 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jan 13 21:04:24.944467 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jan 13 21:04:24.944536 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jan 13 21:04:24.945051 systemd[1]: ignition-files.service: Deactivated successfully.
Jan 13 21:04:24.945115 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jan 13 21:04:24.953294 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jan 13 21:04:24.955303 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jan 13 21:04:24.955405 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 13 21:04:24.955475 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 21:04:24.955701 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 13 21:04:24.955757 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jan 13 21:04:24.957583 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 13 21:04:24.957730 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jan 13 21:04:24.964943 ignition[1027]: INFO : Ignition 2.20.0
Jan 13 21:04:24.964943 ignition[1027]: INFO : Stage: umount
Jan 13 21:04:24.965302 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d"
Jan 13 21:04:24.965302 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/vmware"
Jan 13 21:04:24.966140 ignition[1027]: INFO : umount: umount passed
Jan 13 21:04:24.966266 ignition[1027]: INFO : Ignition finished successfully
Jan 13 21:04:24.967779 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jan 13 21:04:24.968098 systemd[1]: ignition-mount.service: Deactivated successfully.
Jan 13 21:04:24.968148 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jan 13 21:04:24.968848 systemd[1]: Stopped target network.target - Network.
Jan 13 21:04:24.969071 systemd[1]: ignition-disks.service: Deactivated successfully.
Jan 13 21:04:24.969208 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jan 13 21:04:24.969440 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jan 13 21:04:24.969462 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jan 13 21:04:24.969691 systemd[1]: ignition-setup.service: Deactivated successfully.
Jan 13 21:04:24.969711 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jan 13 21:04:24.970216 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jan 13 21:04:24.970242 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jan 13 21:04:24.970417 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jan 13 21:04:24.970561 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jan 13 21:04:24.975119 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jan 13 21:04:24.975206 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jan 13 21:04:24.975500 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jan 13 21:04:24.975527 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 21:04:24.979250 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jan 13 21:04:24.979372 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jan 13 21:04:24.979402 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jan 13 21:04:24.980304 systemd[1]: afterburn-network-kargs.service: Deactivated successfully.
Jan 13 21:04:24.980328 systemd[1]: Stopped afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments.
Jan 13 21:04:24.980522 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 21:04:24.981034 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jan 13 21:04:24.981267 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jan 13 21:04:24.983917 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 13 21:04:24.983960 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jan 13 21:04:24.984894 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 13 21:04:24.985019 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jan 13 21:04:24.985343 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 13 21:04:24.985368 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 21:04:24.987415 systemd[1]: network-cleanup.service: Deactivated successfully.
Jan 13 21:04:24.987468 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jan 13 21:04:24.996516 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 13 21:04:24.996593 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 21:04:24.996887 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 13 21:04:24.996913 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jan 13 21:04:24.997120 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 13 21:04:24.997136 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 21:04:24.997447 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 13 21:04:24.997470 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jan 13 21:04:24.997733 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 13 21:04:24.997754 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jan 13 21:04:24.998033 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jan 13 21:04:24.998056 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jan 13 21:04:25.001310 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jan 13 21:04:25.001417 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 13 21:04:25.001442 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 21:04:25.001566 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jan 13 21:04:25.001589 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 21:04:25.001708 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 13 21:04:25.001729 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 21:04:25.001846 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 13 21:04:25.001867 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:04:25.004255 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 13 21:04:25.004321 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jan 13 21:04:25.016774 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jan 13 21:04:25.016833 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jan 13 21:04:25.017085 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jan 13 21:04:25.017221 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jan 13 21:04:25.017247 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jan 13 21:04:25.020292 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jan 13 21:04:25.036861 systemd[1]: Switching root.
Jan 13 21:04:25.061073 systemd-journald[217]: Journal stopped
Jan 13 21:04:26.213378 systemd-journald[217]: Received SIGTERM from PID 1 (systemd).
Jan 13 21:04:26.213398 kernel: SELinux: policy capability network_peer_controls=1
Jan 13 21:04:26.213407 kernel: SELinux: policy capability open_perms=1
Jan 13 21:04:26.213412 kernel: SELinux: policy capability extended_socket_class=1
Jan 13 21:04:26.213417 kernel: SELinux: policy capability always_check_network=0
Jan 13 21:04:26.213422 kernel: SELinux: policy capability cgroup_seclabel=1
Jan 13 21:04:26.213429 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jan 13 21:04:26.213435 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jan 13 21:04:26.213440 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jan 13 21:04:26.213446 systemd[1]: Successfully loaded SELinux policy in 51.996ms.
Jan 13 21:04:26.213452 kernel: audit: type=1403 audit(1736802265.667:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 13 21:04:26.213458 systemd[1]: Relabeled /dev, /dev/shm, /run, /sys/fs/cgroup in 7.557ms.
Jan 13 21:04:26.213465 systemd[1]: systemd 255 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT default-hierarchy=unified)
Jan 13 21:04:26.213473 systemd[1]: Detected virtualization vmware.
Jan 13 21:04:26.213479 systemd[1]: Detected architecture x86-64.
Jan 13 21:04:26.213486 systemd[1]: Detected first boot.
Jan 13 21:04:26.213493 systemd[1]: Initializing machine ID from random generator.
Jan 13 21:04:26.213500 zram_generator::config[1087]: No configuration found.
Jan 13 21:04:26.213507 systemd[1]: Populated /etc with preset unit settings.
Jan 13 21:04:26.213514 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 21:04:26.213521 systemd[1]: COREOS_CUSTOM_PUBLIC_IPV4=$(ip addr show ens192 | grep -v "inet 10." | grep -Po "inet \K[\d.]+")" > ${OUTPUT}"
Jan 13 21:04:26.213527 systemd[1]: Queued start job for default target multi-user.target.
Jan 13 21:04:26.213533 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jan 13 21:04:26.213539 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jan 13 21:04:26.213546 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jan 13 21:04:26.213553 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jan 13 21:04:26.213559 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jan 13 21:04:26.213565 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jan 13 21:04:26.213572 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jan 13 21:04:26.213578 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jan 13 21:04:26.213584 systemd[1]: Created slice user.slice - User and Session Slice.
Jan 13 21:04:26.213592 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jan 13 21:04:26.213598 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jan 13 21:04:26.213605 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jan 13 21:04:26.213611 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jan 13 21:04:26.213618 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jan 13 21:04:26.213624 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jan 13 21:04:26.213631 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jan 13 21:04:26.213637 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jan 13 21:04:26.213645 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jan 13 21:04:26.213652 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jan 13 21:04:26.213660 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jan 13 21:04:26.213666 systemd[1]: Reached target slices.target - Slice Units.
Jan 13 21:04:26.213672 systemd[1]: Reached target swap.target - Swaps.
Jan 13 21:04:26.213679 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jan 13 21:04:26.213686 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jan 13 21:04:26.213692 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jan 13 21:04:26.213700 systemd[1]: Listening on systemd-journald.socket - Journal Socket.
Jan 13 21:04:26.213707 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jan 13 21:04:26.213713 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jan 13 21:04:26.213720 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jan 13 21:04:26.213727 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jan 13 21:04:26.213735 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jan 13 21:04:26.213741 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jan 13 21:04:26.213748 systemd[1]: Mounting media.mount - External Media Directory...
Jan 13 21:04:26.213755 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:26.213761 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jan 13 21:04:26.213768 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jan 13 21:04:26.213775 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jan 13 21:04:26.213782 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jan 13 21:04:26.213789 systemd[1]: Starting ignition-delete-config.service - Ignition (delete config)...
Jan 13 21:04:26.213796 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jan 13 21:04:26.213803 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jan 13 21:04:26.213809 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 21:04:26.213816 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 21:04:26.213822 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 21:04:26.213829 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jan 13 21:04:26.213836 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 21:04:26.213842 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jan 13 21:04:26.213850 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
Jan 13 21:04:26.213857 systemd[1]: systemd-journald.service: (This warning is only shown for the first unit using IP firewalling.)
Jan 13 21:04:26.213863 systemd[1]: Starting systemd-journald.service - Journal Service...
Jan 13 21:04:26.213870 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jan 13 21:04:26.213876 kernel: fuse: init (API version 7.39)
Jan 13 21:04:26.213882 kernel: loop: module loaded
Jan 13 21:04:26.213888 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jan 13 21:04:26.213905 systemd-journald[1197]: Collecting audit messages is disabled.
Jan 13 21:04:26.213921 systemd-journald[1197]: Journal started
Jan 13 21:04:26.213936 systemd-journald[1197]: Runtime Journal (/run/log/journal/8c52a5ad72c84317be7cac7576a47ebe) is 4.8M, max 38.7M, 33.8M free.
Jan 13 21:04:26.216549 jq[1165]: true
Jan 13 21:04:26.220597 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jan 13 21:04:26.224098 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jan 13 21:04:26.224122 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:26.228744 kernel: ACPI: bus type drm_connector registered
Jan 13 21:04:26.228769 systemd[1]: Started systemd-journald.service - Journal Service.
Jan 13 21:04:26.230429 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jan 13 21:04:26.230564 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jan 13 21:04:26.230692 systemd[1]: Mounted media.mount - External Media Directory.
Jan 13 21:04:26.230815 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jan 13 21:04:26.230948 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jan 13 21:04:26.231075 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jan 13 21:04:26.231353 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jan 13 21:04:26.235526 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jan 13 21:04:26.235822 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 13 21:04:26.235907 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jan 13 21:04:26.236119 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 21:04:26.236209 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 21:04:26.236413 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 21:04:26.236488 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 21:04:26.236726 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 21:04:26.236807 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 21:04:26.237028 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 13 21:04:26.237105 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jan 13 21:04:26.237320 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 21:04:26.237401 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 21:04:26.237637 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jan 13 21:04:26.237869 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jan 13 21:04:26.238108 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jan 13 21:04:26.245624 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jan 13 21:04:26.248690 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jan 13 21:04:26.250723 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jan 13 21:04:26.250834 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jan 13 21:04:26.252079 jq[1219]: true
Jan 13 21:04:26.259479 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jan 13 21:04:26.261009 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jan 13 21:04:26.262706 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 21:04:26.271325 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jan 13 21:04:26.272488 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 21:04:26.276408 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jan 13 21:04:26.280379 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jan 13 21:04:26.282902 systemd-journald[1197]: Time spent on flushing to /var/log/journal/8c52a5ad72c84317be7cac7576a47ebe is 36.600ms for 1820 entries.
Jan 13 21:04:26.282902 systemd-journald[1197]: System Journal (/var/log/journal/8c52a5ad72c84317be7cac7576a47ebe) is 8.0M, max 584.8M, 576.8M free.
Jan 13 21:04:26.327441 systemd-journald[1197]: Received client request to flush runtime journal.
Jan 13 21:04:26.293445 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jan 13 21:04:26.293599 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jan 13 21:04:26.293856 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jan 13 21:04:26.297521 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jan 13 21:04:26.330534 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jan 13 21:04:26.341041 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jan 13 21:04:26.345156 systemd-tmpfiles[1248]: ACLs are not supported, ignoring.
Jan 13 21:04:26.345337 systemd-tmpfiles[1248]: ACLs are not supported, ignoring.
Jan 13 21:04:26.359012 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jan 13 21:04:26.364349 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jan 13 21:04:26.395529 ignition[1249]: Ignition 2.20.0
Jan 13 21:04:26.395716 ignition[1249]: deleting config from guestinfo properties
Jan 13 21:04:26.403779 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jan 13 21:04:26.409480 ignition[1249]: Successfully deleted config
Jan 13 21:04:26.410316 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jan 13 21:04:26.411456 systemd[1]: Finished ignition-delete-config.service - Ignition (delete config).
Jan 13 21:04:26.422721 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jan 13 21:04:26.428344 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
Jan 13 21:04:26.429593 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
Jan 13 21:04:26.429604 systemd-tmpfiles[1275]: ACLs are not supported, ignoring.
Jan 13 21:04:26.436258 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jan 13 21:04:26.437764 udevadm[1281]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
Jan 13 21:04:26.758613 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jan 13 21:04:26.764282 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jan 13 21:04:26.778348 systemd-udevd[1285]: Using default interface naming scheme 'v255'.
Jan 13 21:04:26.793067 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jan 13 21:04:26.801316 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jan 13 21:04:26.812265 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jan 13 21:04:26.835877 systemd[1]: Found device dev-ttyS0.device - /dev/ttyS0.
Jan 13 21:04:26.840283 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jan 13 21:04:26.879196 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1287)
Jan 13 21:04:26.898200 systemd-networkd[1293]: lo: Link UP
Jan 13 21:04:26.898214 systemd-networkd[1293]: lo: Gained carrier
Jan 13 21:04:26.898909 systemd-networkd[1293]: Enumeration completed
Jan 13 21:04:26.898975 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jan 13 21:04:26.899121 systemd-networkd[1293]: ens192: Configuring with /etc/systemd/network/00-vmware.network.
Jan 13 21:04:26.902264 kernel: vmxnet3 0000:0b:00.0 ens192: intr type 3, mode 0, 3 vectors allocated
Jan 13 21:04:26.902396 kernel: vmxnet3 0000:0b:00.0 ens192: NIC Link is Up 10000 Mbps
Jan 13 21:04:26.902502 systemd-networkd[1293]: ens192: Link UP
Jan 13 21:04:26.902615 systemd-networkd[1293]: ens192: Gained carrier
Jan 13 21:04:26.903300 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jan 13 21:04:26.925226 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
Jan 13 21:04:26.933208 kernel: ACPI: button: Power Button [PWRF]
Jan 13 21:04:26.939617 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - Virtual_disk OEM.
Jan 13 21:04:26.970195 kernel: piix4_smbus 0000:00:07.3: SMBus Host Controller not enabled!
Jan 13 21:04:26.973190 kernel: vmw_vmci 0000:00:07.7: Using capabilities 0xc
Jan 13 21:04:26.975196 kernel: Guest personality initialized and is active
Jan 13 21:04:26.976211 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
Jan 13 21:04:26.976229 kernel: Initialized host personality
Jan 13 21:04:26.990194 kernel: input: ImPS/2 Generic Wheel Mouse as /devices/platform/i8042/serio1/input/input3
Jan 13 21:04:27.002240 (udev-worker)[1302]: id: Truncating stdout of 'dmi_memory_id' up to 16384 byte.
Jan 13 21:04:27.004195 kernel: mousedev: PS/2 mouse device common for all mice
Jan 13 21:04:27.011329 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jan 13 21:04:27.018537 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
Jan 13 21:04:27.025360 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
Jan 13 21:04:27.034089 lvm[1328]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 21:04:27.052219 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
Jan 13 21:04:27.052675 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jan 13 21:04:27.061296 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
Jan 13 21:04:27.061615 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jan 13 21:04:27.063841 lvm[1332]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
Jan 13 21:04:27.084212 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
Jan 13 21:04:27.084450 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jan 13 21:04:27.084565 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 13 21:04:27.084595 systemd[1]: Reached target local-fs.target - Local File Systems.
Jan 13 21:04:27.084689 systemd[1]: Reached target machines.target - Containers.
Jan 13 21:04:27.085406 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management (Varlink).
Jan 13 21:04:27.089330 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jan 13 21:04:27.092271 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jan 13 21:04:27.092483 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 21:04:27.093291 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jan 13 21:04:27.096881 systemd[1]: Starting systemd-machine-id-commit.service - Commit a transient machine-id on disk...
Jan 13 21:04:27.100384 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jan 13 21:04:27.100934 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jan 13 21:04:27.114199 kernel: loop0: detected capacity change from 0 to 211296
Jan 13 21:04:27.116215 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jan 13 21:04:27.129080 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jan 13 21:04:27.130029 systemd[1]: Finished systemd-machine-id-commit.service - Commit a transient machine-id on disk.
Jan 13 21:04:27.156197 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jan 13 21:04:27.193194 kernel: loop1: detected capacity change from 0 to 140992
Jan 13 21:04:27.237197 kernel: loop2: detected capacity change from 0 to 138184
Jan 13 21:04:27.281202 kernel: loop3: detected capacity change from 0 to 2944
Jan 13 21:04:27.312207 kernel: loop4: detected capacity change from 0 to 211296
Jan 13 21:04:27.327341 kernel: loop5: detected capacity change from 0 to 140992
Jan 13 21:04:27.352197 kernel: loop6: detected capacity change from 0 to 138184
Jan 13 21:04:27.372198 kernel: loop7: detected capacity change from 0 to 2944
Jan 13 21:04:27.385704 (sd-merge)[1356]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-vmware'.
Jan 13 21:04:27.386401 (sd-merge)[1356]: Merged extensions into '/usr'.
Jan 13 21:04:27.389089 systemd[1]: Reloading requested from client PID 1343 ('systemd-sysext') (unit systemd-sysext.service)...
Jan 13 21:04:27.389098 systemd[1]: Reloading...
Jan 13 21:04:27.430232 zram_generator::config[1381]: No configuration found.
Jan 13 21:04:27.499207 ldconfig[1339]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jan 13 21:04:27.521478 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 21:04:27.536845 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 21:04:27.571827 systemd[1]: Reloading finished in 182 ms.
Jan 13 21:04:27.586368 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jan 13 21:04:27.586673 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jan 13 21:04:27.592278 systemd[1]: Starting ensure-sysext.service...
Jan 13 21:04:27.595272 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jan 13 21:04:27.597095 systemd[1]: Reloading requested from client PID 1447 ('systemctl') (unit ensure-sysext.service)...
Jan 13 21:04:27.597104 systemd[1]: Reloading...
Jan 13 21:04:27.608955 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jan 13 21:04:27.609157 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jan 13 21:04:27.609686 systemd-tmpfiles[1448]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jan 13 21:04:27.609853 systemd-tmpfiles[1448]: ACLs are not supported, ignoring.
Jan 13 21:04:27.609898 systemd-tmpfiles[1448]: ACLs are not supported, ignoring.
Jan 13 21:04:27.611476 systemd-tmpfiles[1448]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 21:04:27.611483 systemd-tmpfiles[1448]: Skipping /boot
Jan 13 21:04:27.617335 systemd-tmpfiles[1448]: Detected autofs mount point /boot during canonicalization of boot.
Jan 13 21:04:27.617343 systemd-tmpfiles[1448]: Skipping /boot
Jan 13 21:04:27.642202 zram_generator::config[1477]: No configuration found.
Jan 13 21:04:27.707967 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+")
Jan 13 21:04:27.724802 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jan 13 21:04:27.760713 systemd[1]: Reloading finished in 163 ms.
Jan 13 21:04:27.785544 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jan 13 21:04:27.788903 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jan 13 21:04:27.793275 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jan 13 21:04:27.795362 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jan 13 21:04:27.798652 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jan 13 21:04:27.804384 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jan 13 21:04:27.808759 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.810395 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 21:04:27.813347 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 21:04:27.818175 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 21:04:27.818341 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 21:04:27.818406 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.819435 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 21:04:27.819526 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 21:04:27.820281 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 21:04:27.820362 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 21:04:27.827433 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.832373 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 21:04:27.833341 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 21:04:27.833497 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 21:04:27.833564 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.834095 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jan 13 21:04:27.834459 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 21:04:27.834544 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 21:04:27.843472 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 21:04:27.843571 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 21:04:27.843934 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 21:04:27.844895 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 21:04:27.847107 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.855465 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jan 13 21:04:27.858414 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jan 13 21:04:27.860297 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jan 13 21:04:27.861018 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jan 13 21:04:27.861205 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 13 21:04:27.861297 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jan 13 21:04:27.861809 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jan 13 21:04:27.861896 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jan 13 21:04:27.869367 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 13 21:04:27.870435 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jan 13 21:04:27.872440 systemd[1]: Finished ensure-sysext.service.
Jan 13 21:04:27.879986 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jan 13 21:04:27.880299 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 13 21:04:27.880387 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jan 13 21:04:27.880644 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jan 13 21:04:27.882512 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jan 13 21:04:27.884489 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 13 21:04:27.884525 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 13 21:04:27.886307 systemd-resolved[1547]: Positive Trust Anchors:
Jan 13 21:04:27.886573 systemd-resolved[1547]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jan 13 21:04:27.886634 systemd-resolved[1547]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jan 13 21:04:27.889874 systemd-resolved[1547]: Defaulting to hostname 'linux'.
Jan 13 21:04:27.893430 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jan 13 21:04:27.895466 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jan 13 21:04:27.895699 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jan 13 21:04:27.895981 systemd[1]: Reached target network.target - Network.
Jan 13 21:04:27.896079 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jan 13 21:04:27.901435 augenrules[1602]: No rules
Jan 13 21:04:27.902585 systemd[1]: audit-rules.service: Deactivated successfully.
Jan 13 21:04:27.904305 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jan 13 21:04:27.907022 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jan 13 21:04:27.932094 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jan 13 21:04:27.932746 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jan 13 21:04:27.933341 systemd[1]: Reached target time-set.target - System Time Set.
Jan 13 21:04:27.933465 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jan 13 21:04:27.933483 systemd[1]: Reached target sysinit.target - System Initialization.
Jan 13 21:04:27.933664 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jan 13 21:04:27.933799 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jan 13 21:04:27.934003 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jan 13 21:04:27.934171 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jan 13 21:04:27.934299 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jan 13 21:04:27.934420 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jan 13 21:04:27.934434 systemd[1]: Reached target paths.target - Path Units.
Jan 13 21:04:27.934535 systemd[1]: Reached target timers.target - Timer Units.
Jan 13 21:04:27.934916 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jan 13 21:04:27.936395 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jan 13 21:04:27.937373 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jan 13 21:04:27.943865 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jan 13 21:04:27.944011 systemd[1]: Reached target sockets.target - Socket Units.
Jan 13 21:04:27.944116 systemd[1]: Reached target basic.target - Basic System.
Jan 13 21:04:27.944311 systemd[1]: System is tainted: cgroupsv1
Jan 13 21:04:27.944337 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jan 13 21:04:27.944352 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jan 13 21:04:27.946159 systemd[1]: Starting containerd.service - containerd container runtime...
Jan 13 21:04:27.949066 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jan 13 21:04:27.951771 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jan 13 21:04:27.958450 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jan 13 21:04:27.958660 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jan 13 21:04:27.960407 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jan 13 21:04:27.965452 jq[1619]: false
Jan 13 21:04:27.965692 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jan 13 21:04:27.967789 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jan 13 21:04:27.971797 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jan 13 21:04:27.978013 systemd[1]: Starting systemd-logind.service - User Login Management...
Jan 13 21:04:27.978375 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jan 13 21:04:27.985313 systemd[1]: Starting update-engine.service - Update Engine...
Jan 13 21:04:27.991811 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jan 13 21:04:27.993130 systemd[1]: Starting vgauthd.service - VGAuth Service for open-vm-tools...
Jan 13 21:04:27.998172 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jan 13 21:04:27.998338 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jan 13 21:04:28.002922 extend-filesystems[1620]: Found loop4
Jan 13 21:04:28.003416 extend-filesystems[1620]: Found loop5
Jan 13 21:04:28.003575 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found loop6
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found loop7
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda1
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda2
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda3
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found usr
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda4
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda6
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda7
Jan 13 21:04:28.005346 extend-filesystems[1620]: Found sda9
Jan 13 21:04:28.005346 extend-filesystems[1620]: Checking size of /dev/sda9
Jan 13 21:04:28.003707 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jan 13 21:04:28.009487 update_engine[1631]: I20250113 21:04:28.008803 1631 main.cc:92] Flatcar Update Engine starting
Jan 13 21:04:28.011132 systemd[1]: motdgen.service: Deactivated successfully.
Jan 13 21:04:28.011286 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jan 13 21:04:28.019140 jq[1636]: true
Jan 13 21:04:28.035422 extend-filesystems[1620]: Old size kept for /dev/sda9
Jan 13 21:04:28.035422 extend-filesystems[1620]: Found sr0
Jan 13 21:04:28.038075 systemd[1]: extend-filesystems.service: Deactivated successfully.
Jan 13 21:04:28.038244 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Jan 13 21:04:28.042561 (ntainerd)[1650]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Jan 13 21:04:28.046335 systemd[1]: Started vgauthd.service - VGAuth Service for open-vm-tools.
Jan 13 21:04:28.053813 jq[1653]: true
Jan 13 21:04:28.052943 systemd[1]: Starting vmtoolsd.service - Service for virtual machines hosted on VMware...
Jan 13 21:04:28.063479 dbus-daemon[1617]: [system] SELinux support is enabled
Jan 13 21:04:28.064137 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jan 13 21:04:28.067603 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jan 13 21:04:28.067625 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jan 13 21:04:28.067787 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jan 13 21:04:28.067797 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jan 13 21:05:40.549552 systemd-resolved[1547]: Clock change detected. Flushing caches.
Jan 13 21:05:40.549715 systemd-timesyncd[1599]: Contacted time server 147.135.4.214:123 (0.flatcar.pool.ntp.org).
Jan 13 21:05:40.549827 systemd-timesyncd[1599]: Initial clock synchronization to Mon 2025-01-13 21:05:40.549521 UTC.
Jan 13 21:05:40.553729 systemd[1]: Started update-engine.service - Update Engine.
Jan 13 21:05:40.554701 update_engine[1631]: I20250113 21:05:40.554215 1631 update_check_scheduler.cc:74] Next update check in 10m32s
Jan 13 21:05:40.554695 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jan 13 21:05:40.560948 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Jan 13 21:05:40.565305 tar[1642]: linux-amd64/helm
Jan 13 21:05:40.579853 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (1290)
Jan 13 21:05:40.579574 systemd[1]: Started vmtoolsd.service - Service for virtual machines hosted on VMware.
Jan 13 21:05:40.585593 systemd-logind[1629]: Watching system buttons on /dev/input/event1 (Power Button)
Jan 13 21:05:40.589861 systemd-logind[1629]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
Jan 13 21:05:40.590004 systemd-logind[1629]: New seat seat0.
Jan 13 21:05:40.591837 systemd[1]: Started systemd-logind.service - User Login Management.
Jan 13 21:05:40.596422 unknown[1666]: Pref_Init: Using '/etc/vmware-tools/vgauth.conf' as preferences filepath
Jan 13 21:05:40.598568 unknown[1666]: Core dump limit set to -1
Jan 13 21:05:40.609824 kernel: NET: Registered PF_VSOCK protocol family
Jan 13 21:05:40.684306 bash[1692]: Updated "/home/core/.ssh/authorized_keys"
Jan 13 21:05:40.684167 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Jan 13 21:05:40.686248 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
Jan 13 21:05:40.757469 locksmithd[1677]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Jan 13 21:05:40.778633 sshd_keygen[1649]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Jan 13 21:05:40.804918 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Jan 13 21:05:40.815059 systemd[1]: Starting issuegen.service - Generate /run/issue...
Jan 13 21:05:40.820727 systemd[1]: issuegen.service: Deactivated successfully.
Jan 13 21:05:40.820884 systemd[1]: Finished issuegen.service - Generate /run/issue.
Jan 13 21:05:40.829084 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Jan 13 21:05:40.844731 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Jan 13 21:05:40.851040 systemd[1]: Started getty@tty1.service - Getty on tty1.
Jan 13 21:05:40.854528 containerd[1650]: time="2025-01-13T21:05:40.854484774Z" level=info msg="starting containerd" revision=9b2ad7760328148397346d10c7b2004271249db4 version=v1.7.23
Jan 13 21:05:40.855739 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
Jan 13 21:05:40.856229 systemd[1]: Reached target getty.target - Login Prompts.
Jan 13 21:05:40.878180 containerd[1650]: time="2025-01-13T21:05:40.878153404Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879305093Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/6.6.71-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879322549Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879332413Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879420214Z" level=info msg="loading plugin \"io.containerd.warning.v1.deprecations\"..." type=io.containerd.warning.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879430506Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879466114Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.blockfile\"..." error="no scratch file generator: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879473850Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879590620Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879598927Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879606004Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." error="devmapper not configured: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879814 containerd[1650]: time="2025-01-13T21:05:40.879611328Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879985 containerd[1650]: time="2025-01-13T21:05:40.879651160Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.879985 containerd[1650]: time="2025-01-13T21:05:40.879782376Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1
Jan 13 21:05:40.880104 containerd[1650]: time="2025-01-13T21:05:40.880094537Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1
Jan 13 21:05:40.880136 containerd[1650]: time="2025-01-13T21:05:40.880129692Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1
Jan 13 21:05:40.880205 containerd[1650]: time="2025-01-13T21:05:40.880196100Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1
Jan 13 21:05:40.880263 containerd[1650]: time="2025-01-13T21:05:40.880256174Z" level=info msg="metadata content store policy set" policy=shared
Jan 13 21:05:40.882171 containerd[1650]: time="2025-01-13T21:05:40.882154124Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1
Jan 13 21:05:40.882251 containerd[1650]: time="2025-01-13T21:05:40.882242801Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1
Jan 13 21:05:40.882453 containerd[1650]: time="2025-01-13T21:05:40.882298972Z" level=info msg="loading plugin \"io.containerd.lease.v1.manager\"..." type=io.containerd.lease.v1
Jan 13 21:05:40.882453 containerd[1650]: time="2025-01-13T21:05:40.882311445Z" level=info msg="loading plugin \"io.containerd.streaming.v1.manager\"..." type=io.containerd.streaming.v1
Jan 13 21:05:40.882453 containerd[1650]: time="2025-01-13T21:05:40.882334148Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1
Jan 13 21:05:40.882453 containerd[1650]: time="2025-01-13T21:05:40.882410658Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1
Jan 13 21:05:40.882827 containerd[1650]: time="2025-01-13T21:05:40.882815825Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2
Jan 13 21:05:40.882919 containerd[1650]: time="2025-01-13T21:05:40.882910855Z" level=info msg="loading plugin \"io.containerd.runtime.v2.shim\"..." type=io.containerd.runtime.v2
Jan 13 21:05:40.882953 containerd[1650]: time="2025-01-13T21:05:40.882947064Z" level=info msg="loading plugin \"io.containerd.sandbox.store.v1.local\"..." type=io.containerd.sandbox.store.v1
Jan 13 21:05:40.882984 containerd[1650]: time="2025-01-13T21:05:40.882978430Z" level=info msg="loading plugin \"io.containerd.sandbox.controller.v1.local\"..." type=io.containerd.sandbox.controller.v1
Jan 13 21:05:40.883015 containerd[1650]: time="2025-01-13T21:05:40.883009553Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.883045 containerd[1650]: time="2025-01-13T21:05:40.883039120Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883068501Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883078696Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883087231Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883094757Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883107233Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883114265Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883126376Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883133937Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883141026Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883148259Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883155332Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883162433Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883169160Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884015 containerd[1650]: time="2025-01-13T21:05:40.883177224Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883184668Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandbox-controllers\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883197726Z" level=info msg="loading plugin \"io.containerd.grpc.v1.sandboxes\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883204723Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883211265Z" level=info msg="loading plugin \"io.containerd.grpc.v1.streaming\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883218806Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883226506Z" level=info msg="loading plugin \"io.containerd.transfer.v1.local\"..." type=io.containerd.transfer.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883238314Z" level=info msg="loading plugin \"io.containerd.grpc.v1.transfer\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883245695Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883251352Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883280670Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883291724Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.tracing.processor.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883297719Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1
Jan 13 21:05:40.884211 containerd[1650]: time="2025-01-13T21:05:40.883304282Z" level=info msg="skip loading plugin \"io.containerd.internal.v1.tracing\"..." error="skip plugin: tracing endpoint not configured" type=io.containerd.internal.v1
Jan 13 21:05:40.884378 containerd[1650]: time="2025-01-13T21:05:40.883309336Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884378 containerd[1650]: time="2025-01-13T21:05:40.883316520Z" level=info msg="loading plugin \"io.containerd.nri.v1.nri\"..." type=io.containerd.nri.v1
Jan 13 21:05:40.884378 containerd[1650]: time="2025-01-13T21:05:40.883322413Z" level=info msg="NRI interface is disabled by configuration."
Jan 13 21:05:40.884378 containerd[1650]: time="2025-01-13T21:05:40.883328430Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883495431Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false PrivilegedWithoutHostDevicesAllDevicesAllowed:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0 Snapshotter: SandboxMode:podsandbox}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreBlockIONotEnabledErrors:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginSetupSerially:false NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.8 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false EnableCDI:false CDISpecDirs:[/etc/cdi /var/run/cdi] ImagePullProgressTimeout:5m0s DrainExecSyncIOTimeout:0s ImagePullWithSyncFs:false IgnoreDeprecationWarnings:[]} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}"
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883523512Z" level=info msg="Connect containerd service"
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883547305Z" level=info msg="using legacy CRI server"
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883552089Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883628055Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\""
Jan 13 21:05:40.884433 containerd[1650]: time="2025-01-13T21:05:40.883914735Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jan 13 21:05:40.884733 containerd[1650]: time="2025-01-13T21:05:40.884723059Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Jan 13 21:05:40.884788 containerd[1650]: time="2025-01-13T21:05:40.884779522Z" level=info msg=serving... address=/run/containerd/containerd.sock
Jan 13 21:05:40.884899 containerd[1650]: time="2025-01-13T21:05:40.884884620Z" level=info msg="Start subscribing containerd event"
Jan 13 21:05:40.884945 containerd[1650]: time="2025-01-13T21:05:40.884938401Z" level=info msg="Start recovering state"
Jan 13 21:05:40.885006 containerd[1650]: time="2025-01-13T21:05:40.884999301Z" level=info msg="Start event monitor"
Jan 13 21:05:40.885043 containerd[1650]: time="2025-01-13T21:05:40.885037220Z" level=info msg="Start snapshots syncer"
Jan 13 21:05:40.885075 containerd[1650]: time="2025-01-13T21:05:40.885069881Z" level=info msg="Start cni network conf syncer for default"
Jan 13 21:05:40.885101 containerd[1650]: time="2025-01-13T21:05:40.885096555Z" level=info msg="Start streaming server"
Jan 13 21:05:40.885159 containerd[1650]: time="2025-01-13T21:05:40.885153006Z" level=info msg="containerd successfully booted in 0.031209s"
Jan 13 21:05:40.885226 systemd[1]: Started containerd.service - containerd container runtime.
Jan 13 21:05:41.015945 tar[1642]: linux-amd64/LICENSE
Jan 13 21:05:41.016863 tar[1642]: linux-amd64/README.md
Jan 13 21:05:41.026137 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Jan 13 21:05:41.250995 systemd-networkd[1293]: ens192: Gained IPv6LL Jan 13 21:05:41.252419 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 13 21:05:41.253172 systemd[1]: Reached target network-online.target - Network is Online. Jan 13 21:05:41.260045 systemd[1]: Starting coreos-metadata.service - VMware metadata agent... Jan 13 21:05:41.263059 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:05:41.265041 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 13 21:05:41.293194 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 13 21:05:41.295395 systemd[1]: coreos-metadata.service: Deactivated successfully. Jan 13 21:05:41.296423 systemd[1]: Finished coreos-metadata.service - VMware metadata agent. Jan 13 21:05:41.298360 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 13 21:05:42.270979 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:05:42.271686 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 13 21:05:42.272235 systemd[1]: Startup finished in 5.988s (kernel) + 4.178s (userspace) = 10.166s. Jan 13 21:05:42.275790 (kubelet)[1823]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:05:42.300682 login[1746]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 21:05:42.301414 login[1749]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Jan 13 21:05:42.306959 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 13 21:05:42.311973 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 13 21:05:42.314201 systemd-logind[1629]: New session 2 of user core. 
Jan 13 21:05:42.318759 systemd-logind[1629]: New session 1 of user core. Jan 13 21:05:42.322297 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 13 21:05:42.329060 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 13 21:05:42.330927 (systemd)[1832]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jan 13 21:05:42.386606 systemd[1832]: Queued start job for default target default.target. Jan 13 21:05:42.386854 systemd[1832]: Created slice app.slice - User Application Slice. Jan 13 21:05:42.386870 systemd[1832]: Reached target paths.target - Paths. Jan 13 21:05:42.386879 systemd[1832]: Reached target timers.target - Timers. Jan 13 21:05:42.392852 systemd[1832]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 13 21:05:42.396913 systemd[1832]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 13 21:05:42.398014 systemd[1832]: Reached target sockets.target - Sockets. Jan 13 21:05:42.398036 systemd[1832]: Reached target basic.target - Basic System. Jan 13 21:05:42.398062 systemd[1832]: Reached target default.target - Main User Target. Jan 13 21:05:42.398081 systemd[1832]: Startup finished in 63ms. Jan 13 21:05:42.398247 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 13 21:05:42.403982 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 13 21:05:42.405112 systemd[1]: Started session-2.scope - Session 2 of User core. 
Jan 13 21:05:42.910778 kubelet[1823]: E0113 21:05:42.910698 1823 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:05:42.912904 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:05:42.913001 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:05:53.163395 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 13 21:05:53.177978 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:05:53.242668 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:05:53.244470 (kubelet)[1885]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:05:53.307438 kubelet[1885]: E0113 21:05:53.307395 1885 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:05:53.309506 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:05:53.309590 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:06:03.560109 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 13 21:06:03.567050 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:03.809512 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 21:06:03.811311 (kubelet)[1906]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:06:03.870643 kubelet[1906]: E0113 21:06:03.870609 1906 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:06:03.871871 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:06:03.871972 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:06:13.958135 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 13 21:06:13.966959 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:14.253952 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:14.254060 (kubelet)[1926]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:06:14.345092 kubelet[1926]: E0113 21:06:14.345038 1926 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:06:14.346846 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:06:14.346981 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:06:20.710130 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Jan 13 21:06:20.715248 systemd[1]: Started sshd@0-139.178.70.107:22-147.75.109.163:39994.service - OpenSSH per-connection server daemon (147.75.109.163:39994). Jan 13 21:06:20.750562 sshd[1936]: Accepted publickey for core from 147.75.109.163 port 39994 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:20.751206 sshd-session[1936]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:20.754454 systemd-logind[1629]: New session 3 of user core. Jan 13 21:06:20.760005 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 13 21:06:20.817862 systemd[1]: Started sshd@1-139.178.70.107:22-147.75.109.163:39998.service - OpenSSH per-connection server daemon (147.75.109.163:39998). Jan 13 21:06:20.844454 sshd[1941]: Accepted publickey for core from 147.75.109.163 port 39998 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:20.845326 sshd-session[1941]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:20.848607 systemd-logind[1629]: New session 4 of user core. Jan 13 21:06:20.855974 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 13 21:06:20.904163 sshd[1944]: Connection closed by 147.75.109.163 port 39998 Jan 13 21:06:20.904858 sshd-session[1941]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:20.913009 systemd[1]: Started sshd@2-139.178.70.107:22-147.75.109.163:40006.service - OpenSSH per-connection server daemon (147.75.109.163:40006). Jan 13 21:06:20.915146 systemd[1]: sshd@1-139.178.70.107:22-147.75.109.163:39998.service: Deactivated successfully. Jan 13 21:06:20.916144 systemd[1]: session-4.scope: Deactivated successfully. Jan 13 21:06:20.917933 systemd-logind[1629]: Session 4 logged out. Waiting for processes to exit. Jan 13 21:06:20.918653 systemd-logind[1629]: Removed session 4. 
Jan 13 21:06:20.945683 sshd[1946]: Accepted publickey for core from 147.75.109.163 port 40006 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:20.946484 sshd-session[1946]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:20.949774 systemd-logind[1629]: New session 5 of user core. Jan 13 21:06:20.958976 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 13 21:06:21.005341 sshd[1952]: Connection closed by 147.75.109.163 port 40006 Jan 13 21:06:21.006912 sshd-session[1946]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:21.014869 systemd[1]: Started sshd@3-139.178.70.107:22-147.75.109.163:40010.service - OpenSSH per-connection server daemon (147.75.109.163:40010). Jan 13 21:06:21.015245 systemd[1]: sshd@2-139.178.70.107:22-147.75.109.163:40006.service: Deactivated successfully. Jan 13 21:06:21.016241 systemd[1]: session-5.scope: Deactivated successfully. Jan 13 21:06:21.018309 systemd-logind[1629]: Session 5 logged out. Waiting for processes to exit. Jan 13 21:06:21.019308 systemd-logind[1629]: Removed session 5. Jan 13 21:06:21.044267 sshd[1954]: Accepted publickey for core from 147.75.109.163 port 40010 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:21.044988 sshd-session[1954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:21.048628 systemd-logind[1629]: New session 6 of user core. Jan 13 21:06:21.054060 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 13 21:06:21.103029 sshd[1960]: Connection closed by 147.75.109.163 port 40010 Jan 13 21:06:21.103958 sshd-session[1954]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:21.111006 systemd[1]: Started sshd@4-139.178.70.107:22-147.75.109.163:40016.service - OpenSSH per-connection server daemon (147.75.109.163:40016). 
Jan 13 21:06:21.111848 systemd[1]: sshd@3-139.178.70.107:22-147.75.109.163:40010.service: Deactivated successfully. Jan 13 21:06:21.112829 systemd[1]: session-6.scope: Deactivated successfully. Jan 13 21:06:21.113663 systemd-logind[1629]: Session 6 logged out. Waiting for processes to exit. Jan 13 21:06:21.115901 systemd-logind[1629]: Removed session 6. Jan 13 21:06:21.143192 sshd[1962]: Accepted publickey for core from 147.75.109.163 port 40016 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:21.143965 sshd-session[1962]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:21.147689 systemd-logind[1629]: New session 7 of user core. Jan 13 21:06:21.157052 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 13 21:06:21.215456 sudo[1969]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 13 21:06:21.215663 sudo[1969]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:06:21.225621 sudo[1969]: pam_unix(sudo:session): session closed for user root Jan 13 21:06:21.227369 sshd[1968]: Connection closed by 147.75.109.163 port 40016 Jan 13 21:06:21.227942 sshd-session[1962]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:21.237040 systemd[1]: Started sshd@5-139.178.70.107:22-147.75.109.163:40024.service - OpenSSH per-connection server daemon (147.75.109.163:40024). Jan 13 21:06:21.237331 systemd[1]: sshd@4-139.178.70.107:22-147.75.109.163:40016.service: Deactivated successfully. Jan 13 21:06:21.241208 systemd[1]: session-7.scope: Deactivated successfully. Jan 13 21:06:21.241680 systemd-logind[1629]: Session 7 logged out. Waiting for processes to exit. Jan 13 21:06:21.243225 systemd-logind[1629]: Removed session 7. 
Jan 13 21:06:21.272271 sshd[1971]: Accepted publickey for core from 147.75.109.163 port 40024 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:21.272943 sshd-session[1971]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:21.275575 systemd-logind[1629]: New session 8 of user core. Jan 13 21:06:21.282004 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 13 21:06:21.330307 sudo[1979]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 13 21:06:21.330512 sudo[1979]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:06:21.332951 sudo[1979]: pam_unix(sudo:session): session closed for user root Jan 13 21:06:21.336582 sudo[1978]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 13 21:06:21.336984 sudo[1978]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:06:21.353096 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 13 21:06:21.368309 augenrules[2001]: No rules Jan 13 21:06:21.368900 systemd[1]: audit-rules.service: Deactivated successfully. Jan 13 21:06:21.369040 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 13 21:06:21.369841 sudo[1978]: pam_unix(sudo:session): session closed for user root Jan 13 21:06:21.370688 sshd[1977]: Connection closed by 147.75.109.163 port 40024 Jan 13 21:06:21.370894 sshd-session[1971]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:21.373387 systemd[1]: sshd@5-139.178.70.107:22-147.75.109.163:40024.service: Deactivated successfully. Jan 13 21:06:21.374657 systemd[1]: session-8.scope: Deactivated successfully. Jan 13 21:06:21.375408 systemd-logind[1629]: Session 8 logged out. Waiting for processes to exit. 
Jan 13 21:06:21.382341 systemd[1]: Started sshd@6-139.178.70.107:22-147.75.109.163:40030.service - OpenSSH per-connection server daemon (147.75.109.163:40030). Jan 13 21:06:21.383232 systemd-logind[1629]: Removed session 8. Jan 13 21:06:21.408810 sshd[2010]: Accepted publickey for core from 147.75.109.163 port 40030 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:06:21.409725 sshd-session[2010]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:06:21.411990 systemd-logind[1629]: New session 9 of user core. Jan 13 21:06:21.421933 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 13 21:06:21.471146 sudo[2014]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 13 21:06:21.471327 sudo[2014]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 13 21:06:21.844075 (dockerd)[2031]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 13 21:06:21.844319 systemd[1]: Starting docker.service - Docker Application Container Engine... Jan 13 21:06:22.086924 dockerd[2031]: time="2025-01-13T21:06:22.086879147Z" level=info msg="Starting up" Jan 13 21:06:22.158862 dockerd[2031]: time="2025-01-13T21:06:22.158677819Z" level=info msg="Loading containers: start." Jan 13 21:06:22.262840 kernel: Initializing XFRM netlink socket Jan 13 21:06:22.313399 systemd-networkd[1293]: docker0: Link UP Jan 13 21:06:22.338738 dockerd[2031]: time="2025-01-13T21:06:22.338711387Z" level=info msg="Loading containers: done." Jan 13 21:06:22.348759 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3446296610-merged.mount: Deactivated successfully. 
Jan 13 21:06:22.349339 dockerd[2031]: time="2025-01-13T21:06:22.348970054Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 13 21:06:22.349339 dockerd[2031]: time="2025-01-13T21:06:22.349031533Z" level=info msg="Docker daemon" commit=8b539b8df24032dabeaaa099cf1d0535ef0286a3 containerd-snapshotter=false storage-driver=overlay2 version=27.2.1 Jan 13 21:06:22.349339 dockerd[2031]: time="2025-01-13T21:06:22.349083843Z" level=info msg="Daemon has completed initialization" Jan 13 21:06:22.366826 dockerd[2031]: time="2025-01-13T21:06:22.366789376Z" level=info msg="API listen on /run/docker.sock" Jan 13 21:06:22.366978 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 13 21:06:23.115015 containerd[1650]: time="2025-01-13T21:06:23.114914220Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\"" Jan 13 21:06:23.871495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1227449661.mount: Deactivated successfully. Jan 13 21:06:24.457854 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 13 21:06:24.466945 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:24.529089 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 13 21:06:24.531253 (kubelet)[2285]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:06:24.584064 kubelet[2285]: E0113 21:06:24.584004 2285 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:06:24.586002 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:06:24.586098 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:06:25.208312 containerd[1650]: time="2025-01-13T21:06:25.208270037Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:25.219941 containerd[1650]: time="2025-01-13T21:06:25.219905865Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.29.12: active requests=0, bytes read=35139254" Jan 13 21:06:25.229121 containerd[1650]: time="2025-01-13T21:06:25.229081494Z" level=info msg="ImageCreate event name:\"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:25.242887 containerd[1650]: time="2025-01-13T21:06:25.242851309Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:25.243769 containerd[1650]: time="2025-01-13T21:06:25.243636740Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.29.12\" with image id \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\", repo tag \"registry.k8s.io/kube-apiserver:v1.29.12\", 
repo digest \"registry.k8s.io/kube-apiserver@sha256:2804b1e7b9e08f3a3468f8fd2f6487c55968b9293ee51b9efb865b3298acfa26\", size \"35136054\" in 2.12868366s" Jan 13 21:06:25.243769 containerd[1650]: time="2025-01-13T21:06:25.243661307Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.29.12\" returns image reference \"sha256:92fbbe8caf9c923e0406b93c082b9e7af30032ace2d836c785633f90514bfefa\"" Jan 13 21:06:25.256535 containerd[1650]: time="2025-01-13T21:06:25.256516058Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\"" Jan 13 21:06:26.110837 update_engine[1631]: I20250113 21:06:26.110495 1631 update_attempter.cc:509] Updating boot flags... Jan 13 21:06:26.147597 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2314) Jan 13 21:06:26.360852 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 36 scanned by (udev-worker) (2318) Jan 13 21:06:27.488330 containerd[1650]: time="2025-01-13T21:06:27.488294843Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:27.500143 containerd[1650]: time="2025-01-13T21:06:27.500107615Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.29.12: active requests=0, bytes read=32217732" Jan 13 21:06:27.509055 containerd[1650]: time="2025-01-13T21:06:27.509040720Z" level=info msg="ImageCreate event name:\"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:27.518129 containerd[1650]: time="2025-01-13T21:06:27.518086487Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:27.518950 containerd[1650]: 
time="2025-01-13T21:06:27.518694120Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.29.12\" with image id \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.29.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:e2f26a3f5ef3fd01f6330cab8b078cf303cfb6d36911a210d0915d535910e412\", size \"33662844\" in 2.262049547s" Jan 13 21:06:27.518950 containerd[1650]: time="2025-01-13T21:06:27.518717744Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.29.12\" returns image reference \"sha256:f3b58a53109c96b6bf82adb5973fefa4baec46e2e9ee200be5cc03f3afbf127d\"" Jan 13 21:06:27.533826 containerd[1650]: time="2025-01-13T21:06:27.533792591Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\"" Jan 13 21:06:28.681613 containerd[1650]: time="2025-01-13T21:06:28.681563795Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:28.684573 containerd[1650]: time="2025-01-13T21:06:28.684464389Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.29.12: active requests=0, bytes read=17332822" Jan 13 21:06:28.692032 containerd[1650]: time="2025-01-13T21:06:28.691958279Z" level=info msg="ImageCreate event name:\"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:28.707145 containerd[1650]: time="2025-01-13T21:06:28.707099203Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:28.707958 containerd[1650]: time="2025-01-13T21:06:28.707798552Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.29.12\" with image id 
\"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\", repo tag \"registry.k8s.io/kube-scheduler:v1.29.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:ed66e2102f4705d45de7513decf3ac61879704984409323779d19e98b970568c\", size \"18777952\" in 1.173977496s" Jan 13 21:06:28.707958 containerd[1650]: time="2025-01-13T21:06:28.707832174Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.29.12\" returns image reference \"sha256:e6d3373aa79026111619cc6cc1ffff8b27006c56422e7c95724b03a61b530eaf\"" Jan 13 21:06:28.722067 containerd[1650]: time="2025-01-13T21:06:28.722037434Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\"" Jan 13 21:06:30.417217 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount307040726.mount: Deactivated successfully. Jan 13 21:06:30.916178 containerd[1650]: time="2025-01-13T21:06:30.916120660Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.29.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:30.926547 containerd[1650]: time="2025-01-13T21:06:30.926510413Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.29.12: active requests=0, bytes read=28619958" Jan 13 21:06:30.941859 containerd[1650]: time="2025-01-13T21:06:30.941817469Z" level=info msg="ImageCreate event name:\"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:30.952364 containerd[1650]: time="2025-01-13T21:06:30.952334986Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:30.952971 containerd[1650]: time="2025-01-13T21:06:30.952836536Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.29.12\" with image id \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\", repo tag 
\"registry.k8s.io/kube-proxy:v1.29.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:bc761494b78fa152a759457f42bc9b86ee9d18f5929bb127bd5f72f8e2112c39\", size \"28618977\" in 2.230769688s" Jan 13 21:06:30.952971 containerd[1650]: time="2025-01-13T21:06:30.952865603Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.29.12\" returns image reference \"sha256:d699d5830022f9e67c3271d1c2af58eaede81e3567df82728b7d2a8bf12ed153\"" Jan 13 21:06:30.970934 containerd[1650]: time="2025-01-13T21:06:30.970901934Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Jan 13 21:06:31.781373 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2015928450.mount: Deactivated successfully. Jan 13 21:06:33.582553 containerd[1650]: time="2025-01-13T21:06:33.581559609Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:33.586947 containerd[1650]: time="2025-01-13T21:06:33.586905056Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761" Jan 13 21:06:33.591683 containerd[1650]: time="2025-01-13T21:06:33.591642524Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:33.594623 containerd[1650]: time="2025-01-13T21:06:33.594597020Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:33.599061 containerd[1650]: time="2025-01-13T21:06:33.598129533Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest 
\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 2.627194124s" Jan 13 21:06:33.599061 containerd[1650]: time="2025-01-13T21:06:33.598166309Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Jan 13 21:06:33.619504 containerd[1650]: time="2025-01-13T21:06:33.619478047Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Jan 13 21:06:34.312069 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1470132454.mount: Deactivated successfully. Jan 13 21:06:34.384846 containerd[1650]: time="2025-01-13T21:06:34.384202584Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:34.387076 containerd[1650]: time="2025-01-13T21:06:34.387050167Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290" Jan 13 21:06:34.391380 containerd[1650]: time="2025-01-13T21:06:34.391361953Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:34.397703 containerd[1650]: time="2025-01-13T21:06:34.397683774Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:34.398401 containerd[1650]: time="2025-01-13T21:06:34.398373488Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 778.868854ms" Jan 13 
21:06:34.398459 containerd[1650]: time="2025-01-13T21:06:34.398402525Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Jan 13 21:06:34.419595 containerd[1650]: time="2025-01-13T21:06:34.419566541Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\"" Jan 13 21:06:34.707893 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 13 21:06:34.714917 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:35.318472 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:35.321215 (kubelet)[2422]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 13 21:06:35.754572 kubelet[2422]: E0113 21:06:35.754484 2422 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 13 21:06:35.756526 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 13 21:06:35.756674 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 13 21:06:36.478470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3081217330.mount: Deactivated successfully. 
Jan 13 21:06:41.488841 containerd[1650]: time="2025-01-13T21:06:41.488225932Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.10-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:41.498623 containerd[1650]: time="2025-01-13T21:06:41.498573168Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.10-0: active requests=0, bytes read=56651625" Jan 13 21:06:41.501370 containerd[1650]: time="2025-01-13T21:06:41.501335418Z" level=info msg="ImageCreate event name:\"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:41.503732 containerd[1650]: time="2025-01-13T21:06:41.503688800Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:06:41.504788 containerd[1650]: time="2025-01-13T21:06:41.504674939Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.10-0\" with image id \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\", repo tag \"registry.k8s.io/etcd:3.5.10-0\", repo digest \"registry.k8s.io/etcd@sha256:22f892d7672adc0b9c86df67792afdb8b2dc08880f49f669eaaa59c47d7908c2\", size \"56649232\" in 7.084925355s" Jan 13 21:06:41.504788 containerd[1650]: time="2025-01-13T21:06:41.504699604Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.10-0\" returns image reference \"sha256:a0eed15eed4498c145ef2f1883fcd300d7adbb759df73c901abd5383dda668e7\"" Jan 13 21:06:44.311293 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:44.320040 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:44.343388 systemd[1]: Reloading requested from client PID 2540 ('systemctl') (unit session-9.scope)... Jan 13 21:06:44.343398 systemd[1]: Reloading... 
Jan 13 21:06:44.417896 zram_generator::config[2578]: No configuration found. Jan 13 21:06:44.477314 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 21:06:44.492614 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 21:06:44.532159 systemd[1]: Reloading finished in 188 ms. Jan 13 21:06:44.571379 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 13 21:06:44.571530 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 13 21:06:44.571798 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:44.576082 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:44.982912 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:44.992039 (kubelet)[2655]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 21:06:45.027141 kubelet[2655]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 21:06:45.027141 kubelet[2655]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 21:06:45.027141 kubelet[2655]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 13 21:06:45.027381 kubelet[2655]: I0113 21:06:45.027174 2655 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 21:06:45.392259 kubelet[2655]: I0113 21:06:45.392188 2655 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 21:06:45.392259 kubelet[2655]: I0113 21:06:45.392214 2655 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 21:06:45.392581 kubelet[2655]: I0113 21:06:45.392565 2655 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 21:06:45.426019 kubelet[2655]: I0113 21:06:45.425837 2655 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 21:06:45.426019 kubelet[2655]: E0113 21:06:45.426000 2655 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.466253 kubelet[2655]: I0113 21:06:45.465900 2655 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 21:06:45.470330 kubelet[2655]: I0113 21:06:45.470293 2655 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 21:06:45.474123 kubelet[2655]: I0113 21:06:45.474105 2655 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 21:06:45.477126 kubelet[2655]: I0113 21:06:45.477105 2655 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 21:06:45.477126 kubelet[2655]: I0113 21:06:45.477123 2655 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 21:06:45.477224 kubelet[2655]: 
I0113 21:06:45.477209 2655 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:06:45.477319 kubelet[2655]: I0113 21:06:45.477306 2655 kubelet.go:396] "Attempting to sync node with API server" Jan 13 21:06:45.477384 kubelet[2655]: I0113 21:06:45.477327 2655 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 21:06:45.477384 kubelet[2655]: I0113 21:06:45.477352 2655 kubelet.go:312] "Adding apiserver pod source" Jan 13 21:06:45.477384 kubelet[2655]: I0113 21:06:45.477360 2655 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 21:06:45.483183 kubelet[2655]: W0113 21:06:45.483062 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.483183 kubelet[2655]: E0113 21:06:45.483093 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.483183 kubelet[2655]: W0113 21:06:45.483130 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.483183 kubelet[2655]: E0113 21:06:45.483148 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.483279 kubelet[2655]: I0113 21:06:45.483235 2655 
kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 21:06:45.490026 kubelet[2655]: I0113 21:06:45.490001 2655 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 21:06:45.491058 kubelet[2655]: W0113 21:06:45.491035 2655 probe.go:268] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 13 21:06:45.491457 kubelet[2655]: I0113 21:06:45.491444 2655 server.go:1256] "Started kubelet" Jan 13 21:06:45.491819 kubelet[2655]: I0113 21:06:45.491591 2655 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 21:06:45.492221 kubelet[2655]: I0113 21:06:45.492213 2655 server.go:461] "Adding debug handlers to kubelet server" Jan 13 21:06:45.494377 kubelet[2655]: I0113 21:06:45.494367 2655 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 21:06:45.497424 kubelet[2655]: I0113 21:06:45.497404 2655 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 21:06:45.497557 kubelet[2655]: I0113 21:06:45.497541 2655 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 21:06:45.499560 kubelet[2655]: E0113 21:06:45.499525 2655 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.107:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.107:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5c95f85c40b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 21:06:45.491425462 +0000 UTC m=+0.496223854,LastTimestamp:2025-01-13 
21:06:45.491425462 +0000 UTC m=+0.496223854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 21:06:45.507284 kubelet[2655]: I0113 21:06:45.506950 2655 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 21:06:45.508440 kubelet[2655]: E0113 21:06:45.508431 2655 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="200ms" Jan 13 21:06:45.508736 kubelet[2655]: I0113 21:06:45.508493 2655 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 21:06:45.508736 kubelet[2655]: W0113 21:06:45.508665 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.508736 kubelet[2655]: E0113 21:06:45.508688 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.508736 kubelet[2655]: I0113 21:06:45.508722 2655 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 21:06:45.509596 kubelet[2655]: I0113 21:06:45.509587 2655 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 21:06:45.516227 kubelet[2655]: I0113 21:06:45.516219 2655 factory.go:221] Registration of the containerd container factory successfully Jan 13 21:06:45.516290 
kubelet[2655]: I0113 21:06:45.516286 2655 factory.go:221] Registration of the systemd container factory successfully Jan 13 21:06:45.525188 kubelet[2655]: I0113 21:06:45.525174 2655 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 21:06:45.526051 kubelet[2655]: I0113 21:06:45.525892 2655 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 13 21:06:45.526051 kubelet[2655]: I0113 21:06:45.525911 2655 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 21:06:45.526051 kubelet[2655]: I0113 21:06:45.525924 2655 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 21:06:45.526051 kubelet[2655]: E0113 21:06:45.525953 2655 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 21:06:45.533816 kubelet[2655]: W0113 21:06:45.532826 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.533816 kubelet[2655]: E0113 21:06:45.532860 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:45.548478 kubelet[2655]: I0113 21:06:45.548463 2655 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 21:06:45.548573 kubelet[2655]: I0113 21:06:45.548567 2655 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 21:06:45.548611 kubelet[2655]: I0113 21:06:45.548607 2655 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:06:45.552701 kubelet[2655]: I0113 21:06:45.552691 2655 
policy_none.go:49] "None policy: Start" Jan 13 21:06:45.553096 kubelet[2655]: I0113 21:06:45.553082 2655 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 21:06:45.553133 kubelet[2655]: I0113 21:06:45.553099 2655 state_mem.go:35] "Initializing new in-memory state store" Jan 13 21:06:45.560470 kubelet[2655]: I0113 21:06:45.560450 2655 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 21:06:45.560631 kubelet[2655]: I0113 21:06:45.560619 2655 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 21:06:45.563868 kubelet[2655]: E0113 21:06:45.563847 2655 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Jan 13 21:06:45.608190 kubelet[2655]: I0113 21:06:45.608117 2655 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 21:06:45.608385 kubelet[2655]: E0113 21:06:45.608370 2655 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" Jan 13 21:06:45.626919 kubelet[2655]: I0113 21:06:45.626775 2655 topology_manager.go:215] "Topology Admit Handler" podUID="2c8dff547e9b1d3642485f1ac66eb3b6" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 21:06:45.630923 kubelet[2655]: I0113 21:06:45.630135 2655 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 21:06:45.632261 kubelet[2655]: I0113 21:06:45.631116 2655 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 21:06:45.709404 kubelet[2655]: E0113 21:06:45.709379 2655 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="400ms" Jan 13 21:06:45.711196 kubelet[2655]: I0113 21:06:45.711112 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:45.711196 kubelet[2655]: I0113 21:06:45.711138 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:45.711196 kubelet[2655]: I0113 21:06:45.711162 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:45.711277 kubelet[2655]: I0113 21:06:45.711206 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:45.711277 kubelet[2655]: I0113 21:06:45.711232 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:45.711277 kubelet[2655]: I0113 21:06:45.711250 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 21:06:45.711277 kubelet[2655]: I0113 21:06:45.711262 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:45.711277 kubelet[2655]: I0113 21:06:45.711274 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:45.711364 kubelet[2655]: I0113 21:06:45.711286 2655 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:45.809605 kubelet[2655]: I0113 21:06:45.809405 2655 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 21:06:45.809605 kubelet[2655]: E0113 21:06:45.809611 
2655 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" Jan 13 21:06:45.934162 containerd[1650]: time="2025-01-13T21:06:45.934086084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2c8dff547e9b1d3642485f1ac66eb3b6,Namespace:kube-system,Attempt:0,}" Jan 13 21:06:45.937530 containerd[1650]: time="2025-01-13T21:06:45.937358624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,}" Jan 13 21:06:45.937810 containerd[1650]: time="2025-01-13T21:06:45.937786862Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,}" Jan 13 21:06:46.074799 kubelet[2655]: E0113 21:06:46.074713 2655 event.go:355] "Unable to write event (may retry after sleeping)" err="Post \"https://139.178.70.107:6443/api/v1/namespaces/default/events\": dial tcp 139.178.70.107:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.181a5c95f85c40b6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-01-13 21:06:45.491425462 +0000 UTC m=+0.496223854,LastTimestamp:2025-01-13 21:06:45.491425462 +0000 UTC m=+0.496223854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Jan 13 21:06:46.110285 kubelet[2655]: E0113 21:06:46.110248 2655 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="800ms" Jan 13 21:06:46.211562 kubelet[2655]: I0113 21:06:46.211534 2655 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 21:06:46.211810 kubelet[2655]: E0113 21:06:46.211791 2655 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" Jan 13 21:06:46.437828 kubelet[2655]: W0113 21:06:46.437718 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.437828 kubelet[2655]: E0113 21:06:46.437758 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://139.178.70.107:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.437828 kubelet[2655]: W0113 21:06:46.437718 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.437828 kubelet[2655]: E0113 21:06:46.437771 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://139.178.70.107:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.614935 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount1689236954.mount: Deactivated successfully. Jan 13 21:06:46.635738 containerd[1650]: time="2025-01-13T21:06:46.635648793Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:06:46.644829 containerd[1650]: time="2025-01-13T21:06:46.644716607Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=312056" Jan 13 21:06:46.651550 containerd[1650]: time="2025-01-13T21:06:46.651472225Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:06:46.655817 containerd[1650]: time="2025-01-13T21:06:46.654323337Z" level=info msg="ImageCreate event name:\"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:06:46.663165 containerd[1650]: time="2025-01-13T21:06:46.663135989Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 21:06:46.667610 containerd[1650]: time="2025-01-13T21:06:46.667582293Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:06:46.672626 containerd[1650]: time="2025-01-13T21:06:46.672598002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 13 21:06:46.673390 containerd[1650]: time="2025-01-13T21:06:46.673362832Z" level=info msg="Pulled 
image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 735.494698ms" Jan 13 21:06:46.675729 containerd[1650]: time="2025-01-13T21:06:46.674720123Z" level=info msg="stop pulling image registry.k8s.io/pause:3.8: active requests=0, bytes read=0" Jan 13 21:06:46.678093 containerd[1650]: time="2025-01-13T21:06:46.677625945Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 740.216595ms" Jan 13 21:06:46.681856 containerd[1650]: time="2025-01-13T21:06:46.681627363Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.8\" with image id \"sha256:4873874c08efc72e9729683a83ffbb7502ee729e9a5ac097723806ea7fa13517\", repo tag \"registry.k8s.io/pause:3.8\", repo digest \"registry.k8s.io/pause@sha256:9001185023633d17a2f98ff69b6ff2615b8ea02a825adffa40422f51dfdcde9d\", size \"311286\" in 747.458196ms" Jan 13 21:06:46.810499 containerd[1650]: time="2025-01-13T21:06:46.810232158Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:06:46.810819 containerd[1650]: time="2025-01-13T21:06:46.810608000Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:06:46.810819 containerd[1650]: time="2025-01-13T21:06:46.810621835Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.810901 containerd[1650]: time="2025-01-13T21:06:46.810837961Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.817015 containerd[1650]: time="2025-01-13T21:06:46.807423850Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:06:46.817181 containerd[1650]: time="2025-01-13T21:06:46.817027730Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:06:46.817280 containerd[1650]: time="2025-01-13T21:06:46.817041808Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.817280 containerd[1650]: time="2025-01-13T21:06:46.817197896Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.818251 containerd[1650]: time="2025-01-13T21:06:46.818143366Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:06:46.818251 containerd[1650]: time="2025-01-13T21:06:46.818169115Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:06:46.818251 containerd[1650]: time="2025-01-13T21:06:46.818177974Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.818251 containerd[1650]: time="2025-01-13T21:06:46.818215086Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:06:46.877702 containerd[1650]: time="2025-01-13T21:06:46.877666755Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:4f8e0d694c07e04969646aa3c152c34a,Namespace:kube-system,Attempt:0,} returns sandbox id \"4be647f7757cf6c7aad9a09c0f7b9dbd0eb6bb12c50fb55e7eecfc8ad9b67877\"" Jan 13 21:06:46.880832 containerd[1650]: time="2025-01-13T21:06:46.879892007Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:2c8dff547e9b1d3642485f1ac66eb3b6,Namespace:kube-system,Attempt:0,} returns sandbox id \"0977d3b3503cf7cf096ccf855302a113c378838c44b73489a76bc32e3b10ba3a\"" Jan 13 21:06:46.896823 containerd[1650]: time="2025-01-13T21:06:46.896733135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:c4144e8f85b2123a6afada0c1705bbba,Namespace:kube-system,Attempt:0,} returns sandbox id \"570d7289e831e09370c748338b9da438ac4c8546da80e9d16d90a8c9c2849d16\"" Jan 13 21:06:46.900988 kubelet[2655]: W0113 21:06:46.895081 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.900988 kubelet[2655]: E0113 21:06:46.899766 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://139.178.70.107:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.901403 containerd[1650]: time="2025-01-13T21:06:46.901252927Z" level=info msg="CreateContainer within sandbox \"0977d3b3503cf7cf096ccf855302a113c378838c44b73489a76bc32e3b10ba3a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 13 21:06:46.902078 
containerd[1650]: time="2025-01-13T21:06:46.902030861Z" level=info msg="CreateContainer within sandbox \"4be647f7757cf6c7aad9a09c0f7b9dbd0eb6bb12c50fb55e7eecfc8ad9b67877\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 13 21:06:46.902481 containerd[1650]: time="2025-01-13T21:06:46.902368098Z" level=info msg="CreateContainer within sandbox \"570d7289e831e09370c748338b9da438ac4c8546da80e9d16d90a8c9c2849d16\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 13 21:06:46.910990 kubelet[2655]: E0113 21:06:46.910970 2655 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://139.178.70.107:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 139.178.70.107:6443: connect: connection refused" interval="1.6s" Jan 13 21:06:46.975013 containerd[1650]: time="2025-01-13T21:06:46.974977549Z" level=info msg="CreateContainer within sandbox \"0977d3b3503cf7cf096ccf855302a113c378838c44b73489a76bc32e3b10ba3a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"afe87da313917b544a8b96677a1d29b26759e0b33b1b78fd33855e31a3dacb7a\"" Jan 13 21:06:46.975469 containerd[1650]: time="2025-01-13T21:06:46.975432573Z" level=info msg="StartContainer for \"afe87da313917b544a8b96677a1d29b26759e0b33b1b78fd33855e31a3dacb7a\"" Jan 13 21:06:46.986827 kubelet[2655]: W0113 21:06:46.986775 2655 reflector.go:539] vendor/k8s.io/client-go/informers/factory.go:159: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:46.989015 kubelet[2655]: E0113 21:06:46.986845 2655 reflector.go:147] vendor/k8s.io/client-go/informers/factory.go:159: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://139.178.70.107:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 
139.178.70.107:6443: connect: connection refused Jan 13 21:06:47.028876 kubelet[2655]: I0113 21:06:47.028861 2655 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 21:06:47.029328 kubelet[2655]: E0113 21:06:47.029318 2655 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://139.178.70.107:6443/api/v1/nodes\": dial tcp 139.178.70.107:6443: connect: connection refused" node="localhost" Jan 13 21:06:47.036193 containerd[1650]: time="2025-01-13T21:06:47.036172973Z" level=info msg="CreateContainer within sandbox \"4be647f7757cf6c7aad9a09c0f7b9dbd0eb6bb12c50fb55e7eecfc8ad9b67877\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"316fa58ad7a43ca59ec578942813ba2fd80251506160d5e5ec9ecb69021c1207\"" Jan 13 21:06:47.036821 containerd[1650]: time="2025-01-13T21:06:47.036254721Z" level=info msg="StartContainer for \"afe87da313917b544a8b96677a1d29b26759e0b33b1b78fd33855e31a3dacb7a\" returns successfully" Jan 13 21:06:47.036821 containerd[1650]: time="2025-01-13T21:06:47.036498405Z" level=info msg="StartContainer for \"316fa58ad7a43ca59ec578942813ba2fd80251506160d5e5ec9ecb69021c1207\"" Jan 13 21:06:47.048595 containerd[1650]: time="2025-01-13T21:06:47.048518941Z" level=info msg="CreateContainer within sandbox \"570d7289e831e09370c748338b9da438ac4c8546da80e9d16d90a8c9c2849d16\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"20e8974a2caf5e626f8ca007e26049c0b9485b8674315662410a76531b2e504c\"" Jan 13 21:06:47.049611 containerd[1650]: time="2025-01-13T21:06:47.048960804Z" level=info msg="StartContainer for \"20e8974a2caf5e626f8ca007e26049c0b9485b8674315662410a76531b2e504c\"" Jan 13 21:06:47.094964 containerd[1650]: time="2025-01-13T21:06:47.094707612Z" level=info msg="StartContainer for \"316fa58ad7a43ca59ec578942813ba2fd80251506160d5e5ec9ecb69021c1207\" returns successfully" Jan 13 21:06:47.111078 containerd[1650]: 
time="2025-01-13T21:06:47.111046567Z" level=info msg="StartContainer for \"20e8974a2caf5e626f8ca007e26049c0b9485b8674315662410a76531b2e504c\" returns successfully" Jan 13 21:06:47.619265 kubelet[2655]: E0113 21:06:47.619243 2655 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://139.178.70.107:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 139.178.70.107:6443: connect: connection refused Jan 13 21:06:48.630941 kubelet[2655]: I0113 21:06:48.630620 2655 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Jan 13 21:06:49.042197 kubelet[2655]: E0113 21:06:49.042166 2655 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Jan 13 21:06:49.112980 kubelet[2655]: I0113 21:06:49.112531 2655 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 21:06:49.122417 kubelet[2655]: E0113 21:06:49.122393 2655 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 21:06:49.223842 kubelet[2655]: E0113 21:06:49.223790 2655 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Jan 13 21:06:49.484900 kubelet[2655]: I0113 21:06:49.484439 2655 apiserver.go:52] "Watching apiserver" Jan 13 21:06:49.508830 kubelet[2655]: I0113 21:06:49.508799 2655 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 21:06:51.685413 systemd[1]: Reloading requested from client PID 2929 ('systemctl') (unit session-9.scope)... Jan 13 21:06:51.685428 systemd[1]: Reloading... Jan 13 21:06:51.738826 zram_generator::config[2970]: No configuration found. 
Jan 13 21:06:51.816007 systemd[1]: /etc/systemd/system/coreos-metadata.service:11: Ignoring unknown escape sequences: "echo "COREOS_CUSTOM_PRIVATE_IPV4=$(ip addr show ens192 | grep "inet 10." | grep -Po "inet \K[\d.]+") Jan 13 21:06:51.835380 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jan 13 21:06:51.882892 systemd[1]: Reloading finished in 197 ms. Jan 13 21:06:51.900479 kubelet[2655]: I0113 21:06:51.900412 2655 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 21:06:51.900491 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:51.911605 systemd[1]: kubelet.service: Deactivated successfully. Jan 13 21:06:51.911781 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:51.918980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 13 21:06:52.283284 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 13 21:06:52.288731 (kubelet)[3044]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 13 21:06:52.362466 kubelet[3044]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 21:06:52.362466 kubelet[3044]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 13 21:06:52.362466 kubelet[3044]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 13 21:06:52.362750 kubelet[3044]: I0113 21:06:52.362487 3044 server.go:204] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 13 21:06:52.366792 kubelet[3044]: I0113 21:06:52.366771 3044 server.go:487] "Kubelet version" kubeletVersion="v1.29.2" Jan 13 21:06:52.366792 kubelet[3044]: I0113 21:06:52.366788 3044 server.go:489] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 13 21:06:52.366931 kubelet[3044]: I0113 21:06:52.366919 3044 server.go:919] "Client rotation is on, will bootstrap in background" Jan 13 21:06:52.369404 kubelet[3044]: I0113 21:06:52.369128 3044 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jan 13 21:06:52.377319 kubelet[3044]: I0113 21:06:52.376735 3044 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 13 21:06:52.386192 kubelet[3044]: I0113 21:06:52.386172 3044 server.go:745] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 13 21:06:52.386480 kubelet[3044]: I0113 21:06:52.386468 3044 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 13 21:06:52.386595 kubelet[3044]: I0113 21:06:52.386584 3044 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Jan 13 21:06:52.386663 kubelet[3044]: I0113 21:06:52.386602 3044 topology_manager.go:138] "Creating topology manager with none policy" Jan 13 21:06:52.386663 kubelet[3044]: I0113 21:06:52.386609 3044 container_manager_linux.go:301] "Creating device plugin manager" Jan 13 21:06:52.386663 kubelet[3044]: 
I0113 21:06:52.386633 3044 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:06:52.386722 kubelet[3044]: I0113 21:06:52.386689 3044 kubelet.go:396] "Attempting to sync node with API server" Jan 13 21:06:52.386722 kubelet[3044]: I0113 21:06:52.386702 3044 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 13 21:06:52.386722 kubelet[3044]: I0113 21:06:52.386720 3044 kubelet.go:312] "Adding apiserver pod source" Jan 13 21:06:52.386838 kubelet[3044]: I0113 21:06:52.386732 3044 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 13 21:06:52.388725 kubelet[3044]: I0113 21:06:52.388360 3044 kuberuntime_manager.go:258] "Container runtime initialized" containerRuntime="containerd" version="v1.7.23" apiVersion="v1" Jan 13 21:06:52.388725 kubelet[3044]: I0113 21:06:52.388507 3044 kubelet.go:809] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 13 21:06:52.388933 kubelet[3044]: I0113 21:06:52.388759 3044 server.go:1256] "Started kubelet" Jan 13 21:06:52.391631 kubelet[3044]: I0113 21:06:52.390936 3044 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 13 21:06:52.403190 kubelet[3044]: I0113 21:06:52.403166 3044 server.go:162] "Starting to listen" address="0.0.0.0" port=10250 Jan 13 21:06:52.406664 kubelet[3044]: I0113 21:06:52.406375 3044 server.go:461] "Adding debug handlers to kubelet server" Jan 13 21:06:52.410114 kubelet[3044]: I0113 21:06:52.410091 3044 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 13 21:06:52.410227 kubelet[3044]: I0113 21:06:52.410218 3044 server.go:233] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 13 21:06:52.413429 kubelet[3044]: I0113 21:06:52.413162 3044 volume_manager.go:291] "Starting Kubelet Volume Manager" Jan 13 21:06:52.420382 kubelet[3044]: I0113 21:06:52.420369 3044 factory.go:221] Registration of the systemd 
container factory successfully Jan 13 21:06:52.420638 kubelet[3044]: I0113 21:06:52.420567 3044 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 13 21:06:52.423357 kubelet[3044]: I0113 21:06:52.423334 3044 desired_state_of_world_populator.go:151] "Desired state populator starts to run" Jan 13 21:06:52.423472 kubelet[3044]: I0113 21:06:52.423462 3044 reconciler_new.go:29] "Reconciler: start to sync state" Jan 13 21:06:52.424780 kubelet[3044]: E0113 21:06:52.424766 3044 kubelet.go:1462] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 13 21:06:52.428543 kubelet[3044]: I0113 21:06:52.428495 3044 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 13 21:06:52.431956 kubelet[3044]: I0113 21:06:52.429056 3044 factory.go:221] Registration of the containerd container factory successfully Jan 13 21:06:52.433026 kubelet[3044]: I0113 21:06:52.432926 3044 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 13 21:06:52.433026 kubelet[3044]: I0113 21:06:52.432948 3044 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 13 21:06:52.433026 kubelet[3044]: I0113 21:06:52.432963 3044 kubelet.go:2329] "Starting kubelet main sync loop" Jan 13 21:06:52.433026 kubelet[3044]: E0113 21:06:52.432991 3044 kubelet.go:2353] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 13 21:06:52.481025 kubelet[3044]: I0113 21:06:52.481000 3044 cpu_manager.go:214] "Starting CPU manager" policy="none" Jan 13 21:06:52.481658 kubelet[3044]: I0113 21:06:52.481645 3044 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jan 13 21:06:52.481685 kubelet[3044]: I0113 21:06:52.481662 3044 state_mem.go:36] "Initialized new in-memory state store" Jan 13 21:06:52.481787 kubelet[3044]: I0113 21:06:52.481776 3044 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 13 21:06:52.481852 kubelet[3044]: I0113 21:06:52.481795 3044 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 13 21:06:52.481852 kubelet[3044]: I0113 21:06:52.481800 3044 policy_none.go:49] "None policy: Start" Jan 13 21:06:52.482135 kubelet[3044]: I0113 21:06:52.482122 3044 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 13 21:06:52.482135 kubelet[3044]: I0113 21:06:52.482135 3044 state_mem.go:35] "Initializing new in-memory state store" Jan 13 21:06:52.482269 kubelet[3044]: I0113 21:06:52.482256 3044 state_mem.go:75] "Updated machine memory state" Jan 13 21:06:52.483045 kubelet[3044]: I0113 21:06:52.482955 3044 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 13 21:06:52.483270 kubelet[3044]: I0113 21:06:52.483257 3044 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 13 21:06:52.517189 kubelet[3044]: I0113 21:06:52.517162 3044 kubelet_node_status.go:73] "Attempting to register 
node" node="localhost" Jan 13 21:06:52.520728 kubelet[3044]: I0113 21:06:52.520711 3044 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Jan 13 21:06:52.520986 kubelet[3044]: I0113 21:06:52.520901 3044 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Jan 13 21:06:52.533370 kubelet[3044]: I0113 21:06:52.533281 3044 topology_manager.go:215] "Topology Admit Handler" podUID="c4144e8f85b2123a6afada0c1705bbba" podNamespace="kube-system" podName="kube-scheduler-localhost" Jan 13 21:06:52.534792 kubelet[3044]: I0113 21:06:52.533523 3044 topology_manager.go:215] "Topology Admit Handler" podUID="2c8dff547e9b1d3642485f1ac66eb3b6" podNamespace="kube-system" podName="kube-apiserver-localhost" Jan 13 21:06:52.534792 kubelet[3044]: I0113 21:06:52.533564 3044 topology_manager.go:215] "Topology Admit Handler" podUID="4f8e0d694c07e04969646aa3c152c34a" podNamespace="kube-system" podName="kube-controller-manager-localhost" Jan 13 21:06:52.538795 kubelet[3044]: E0113 21:06:52.538772 3044 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726475 kubelet[3044]: I0113 21:06:52.726447 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726475 kubelet[3044]: I0113 21:06:52.726478 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " 
pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:52.726620 kubelet[3044]: I0113 21:06:52.726496 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:52.726620 kubelet[3044]: I0113 21:06:52.726510 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726620 kubelet[3044]: I0113 21:06:52.726524 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726620 kubelet[3044]: I0113 21:06:52.726535 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726620 kubelet[3044]: I0113 21:06:52.726548 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4f8e0d694c07e04969646aa3c152c34a-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: 
\"4f8e0d694c07e04969646aa3c152c34a\") " pod="kube-system/kube-controller-manager-localhost" Jan 13 21:06:52.726759 kubelet[3044]: I0113 21:06:52.726559 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c4144e8f85b2123a6afada0c1705bbba-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"c4144e8f85b2123a6afada0c1705bbba\") " pod="kube-system/kube-scheduler-localhost" Jan 13 21:06:52.726759 kubelet[3044]: I0113 21:06:52.726570 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2c8dff547e9b1d3642485f1ac66eb3b6-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"2c8dff547e9b1d3642485f1ac66eb3b6\") " pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:53.387288 kubelet[3044]: I0113 21:06:53.387115 3044 apiserver.go:52] "Watching apiserver" Jan 13 21:06:53.424203 kubelet[3044]: I0113 21:06:53.424156 3044 desired_state_of_world_populator.go:159] "Finished populating initial desired state of world" Jan 13 21:06:53.465016 kubelet[3044]: E0113 21:06:53.464727 3044 kubelet.go:1921] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Jan 13 21:06:53.485917 kubelet[3044]: I0113 21:06:53.485898 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.484627722 podStartE2EDuration="1.484627722s" podCreationTimestamp="2025-01-13 21:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:06:53.480344698 +0000 UTC m=+1.175893188" watchObservedRunningTime="2025-01-13 21:06:53.484627722 +0000 UTC m=+1.180176210" Jan 13 21:06:53.486096 kubelet[3044]: I0113 21:06:53.486088 3044 pod_startup_latency_tracker.go:102] 
"Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=3.486070377 podStartE2EDuration="3.486070377s" podCreationTimestamp="2025-01-13 21:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:06:53.48423814 +0000 UTC m=+1.179786630" watchObservedRunningTime="2025-01-13 21:06:53.486070377 +0000 UTC m=+1.181618869" Jan 13 21:06:53.489489 kubelet[3044]: I0113 21:06:53.488986 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.48896754 podStartE2EDuration="1.48896754s" podCreationTimestamp="2025-01-13 21:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:06:53.488631133 +0000 UTC m=+1.184179629" watchObservedRunningTime="2025-01-13 21:06:53.48896754 +0000 UTC m=+1.184516036" Jan 13 21:06:56.488305 sudo[2014]: pam_unix(sudo:session): session closed for user root Jan 13 21:06:56.489561 sshd[2013]: Connection closed by 147.75.109.163 port 40030 Jan 13 21:06:56.493509 sshd-session[2010]: pam_unix(sshd:session): session closed for user core Jan 13 21:06:56.496102 systemd[1]: sshd@6-139.178.70.107:22-147.75.109.163:40030.service: Deactivated successfully. Jan 13 21:06:56.500702 systemd[1]: session-9.scope: Deactivated successfully. Jan 13 21:06:56.502545 systemd-logind[1629]: Session 9 logged out. Waiting for processes to exit. Jan 13 21:06:56.503408 systemd-logind[1629]: Removed session 9. Jan 13 21:07:05.294888 kubelet[3044]: I0113 21:07:05.294870 3044 kuberuntime_manager.go:1529] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 13 21:07:05.296844 containerd[1650]: time="2025-01-13T21:07:05.295369052Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Jan 13 21:07:05.297049 kubelet[3044]: I0113 21:07:05.295494 3044 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 13 21:07:05.879102 kubelet[3044]: I0113 21:07:05.878865 3044 topology_manager.go:215] "Topology Admit Handler" podUID="e997024a-6109-48b9-98ff-3956e122d97f" podNamespace="kube-system" podName="kube-proxy-78zkz" Jan 13 21:07:05.904682 kubelet[3044]: I0113 21:07:05.904111 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e997024a-6109-48b9-98ff-3956e122d97f-lib-modules\") pod \"kube-proxy-78zkz\" (UID: \"e997024a-6109-48b9-98ff-3956e122d97f\") " pod="kube-system/kube-proxy-78zkz" Jan 13 21:07:05.904682 kubelet[3044]: I0113 21:07:05.904142 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5xv7\" (UniqueName: \"kubernetes.io/projected/e997024a-6109-48b9-98ff-3956e122d97f-kube-api-access-f5xv7\") pod \"kube-proxy-78zkz\" (UID: \"e997024a-6109-48b9-98ff-3956e122d97f\") " pod="kube-system/kube-proxy-78zkz" Jan 13 21:07:05.904682 kubelet[3044]: I0113 21:07:05.904160 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/e997024a-6109-48b9-98ff-3956e122d97f-kube-proxy\") pod \"kube-proxy-78zkz\" (UID: \"e997024a-6109-48b9-98ff-3956e122d97f\") " pod="kube-system/kube-proxy-78zkz" Jan 13 21:07:05.904682 kubelet[3044]: I0113 21:07:05.904174 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e997024a-6109-48b9-98ff-3956e122d97f-xtables-lock\") pod \"kube-proxy-78zkz\" (UID: \"e997024a-6109-48b9-98ff-3956e122d97f\") " pod="kube-system/kube-proxy-78zkz" Jan 13 21:07:06.075836 kubelet[3044]: I0113 21:07:06.071159 3044 topology_manager.go:215] "Topology 
Admit Handler" podUID="3e1429e1-4809-4044-8efd-9a9972500461" podNamespace="tigera-operator" podName="tigera-operator-c7ccbd65-qx4n4" Jan 13 21:07:06.106115 kubelet[3044]: I0113 21:07:06.106085 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9ts\" (UniqueName: \"kubernetes.io/projected/3e1429e1-4809-4044-8efd-9a9972500461-kube-api-access-dr9ts\") pod \"tigera-operator-c7ccbd65-qx4n4\" (UID: \"3e1429e1-4809-4044-8efd-9a9972500461\") " pod="tigera-operator/tigera-operator-c7ccbd65-qx4n4" Jan 13 21:07:06.106218 kubelet[3044]: I0113 21:07:06.106129 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3e1429e1-4809-4044-8efd-9a9972500461-var-lib-calico\") pod \"tigera-operator-c7ccbd65-qx4n4\" (UID: \"3e1429e1-4809-4044-8efd-9a9972500461\") " pod="tigera-operator/tigera-operator-c7ccbd65-qx4n4" Jan 13 21:07:06.215548 containerd[1650]: time="2025-01-13T21:07:06.215522059Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-78zkz,Uid:e997024a-6109-48b9-98ff-3956e122d97f,Namespace:kube-system,Attempt:0,}" Jan 13 21:07:06.325165 containerd[1650]: time="2025-01-13T21:07:06.325106566Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:06.325165 containerd[1650]: time="2025-01-13T21:07:06.325165372Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:06.325509 containerd[1650]: time="2025-01-13T21:07:06.325192520Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:06.325509 containerd[1650]: time="2025-01-13T21:07:06.325261088Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:06.349457 containerd[1650]: time="2025-01-13T21:07:06.349427329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-78zkz,Uid:e997024a-6109-48b9-98ff-3956e122d97f,Namespace:kube-system,Attempt:0,} returns sandbox id \"30a0448b6bb24b3705a97bbfefd63b4b94d4d95954672c9c390564f3bf888692\"" Jan 13 21:07:06.351643 containerd[1650]: time="2025-01-13T21:07:06.351561044Z" level=info msg="CreateContainer within sandbox \"30a0448b6bb24b3705a97bbfefd63b4b94d4d95954672c9c390564f3bf888692\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 13 21:07:06.381293 containerd[1650]: time="2025-01-13T21:07:06.381271979Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-qx4n4,Uid:3e1429e1-4809-4044-8efd-9a9972500461,Namespace:tigera-operator,Attempt:0,}" Jan 13 21:07:06.445913 containerd[1650]: time="2025-01-13T21:07:06.445814743Z" level=info msg="CreateContainer within sandbox \"30a0448b6bb24b3705a97bbfefd63b4b94d4d95954672c9c390564f3bf888692\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"64970f97bba995116e34d5a94fd280259f3b4e10d13b6b4609e813b42802736b\"" Jan 13 21:07:06.447511 containerd[1650]: time="2025-01-13T21:07:06.447489600Z" level=info msg="StartContainer for \"64970f97bba995116e34d5a94fd280259f3b4e10d13b6b4609e813b42802736b\"" Jan 13 21:07:06.458215 containerd[1650]: time="2025-01-13T21:07:06.458107729Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:06.458215 containerd[1650]: time="2025-01-13T21:07:06.458163230Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:06.458215 containerd[1650]: time="2025-01-13T21:07:06.458176496Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:06.458560 containerd[1650]: time="2025-01-13T21:07:06.458485943Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:06.513999 containerd[1650]: time="2025-01-13T21:07:06.513943990Z" level=info msg="StartContainer for \"64970f97bba995116e34d5a94fd280259f3b4e10d13b6b4609e813b42802736b\" returns successfully" Jan 13 21:07:06.516529 containerd[1650]: time="2025-01-13T21:07:06.516449695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-c7ccbd65-qx4n4,Uid:3e1429e1-4809-4044-8efd-9a9972500461,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"74ee9a9dede4e0df9b17a7a35bf8f3aba1e9de0bfab67394f9f8e9cbe75e0942\"" Jan 13 21:07:06.520880 containerd[1650]: time="2025-01-13T21:07:06.520835702Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Jan 13 21:07:07.487708 kubelet[3044]: I0113 21:07:07.487596 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/kube-proxy-78zkz" podStartSLOduration=2.485933009 podStartE2EDuration="2.485933009s" podCreationTimestamp="2025-01-13 21:07:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:07:07.485398513 +0000 UTC m=+15.180947009" watchObservedRunningTime="2025-01-13 21:07:07.485933009 +0000 UTC m=+15.181481500" Jan 13 21:07:08.515326 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount352807316.mount: Deactivated successfully. 
Jan 13 21:07:08.882885 containerd[1650]: time="2025-01-13T21:07:08.882782952Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:08.894100 containerd[1650]: time="2025-01-13T21:07:08.894059254Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.2: active requests=0, bytes read=21764341" Jan 13 21:07:08.906658 containerd[1650]: time="2025-01-13T21:07:08.906620949Z" level=info msg="ImageCreate event name:\"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:08.917631 containerd[1650]: time="2025-01-13T21:07:08.917590123Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:08.924683 containerd[1650]: time="2025-01-13T21:07:08.918110670Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.2\" with image id \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\", repo tag \"quay.io/tigera/operator:v1.36.2\", repo digest \"quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764\", size \"21758492\" in 2.397253014s" Jan 13 21:07:08.924683 containerd[1650]: time="2025-01-13T21:07:08.918131111Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Jan 13 21:07:08.952257 containerd[1650]: time="2025-01-13T21:07:08.952152431Z" level=info msg="CreateContainer within sandbox \"74ee9a9dede4e0df9b17a7a35bf8f3aba1e9de0bfab67394f9f8e9cbe75e0942\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 13 21:07:08.963607 containerd[1650]: time="2025-01-13T21:07:08.963577931Z" level=info msg="CreateContainer within sandbox 
\"74ee9a9dede4e0df9b17a7a35bf8f3aba1e9de0bfab67394f9f8e9cbe75e0942\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"afa3864623bdc4fe67e99f0b82abf422998ef47cf5425245915cef90a805301a\"" Jan 13 21:07:08.964249 containerd[1650]: time="2025-01-13T21:07:08.964220383Z" level=info msg="StartContainer for \"afa3864623bdc4fe67e99f0b82abf422998ef47cf5425245915cef90a805301a\"" Jan 13 21:07:09.022497 containerd[1650]: time="2025-01-13T21:07:09.022467573Z" level=info msg="StartContainer for \"afa3864623bdc4fe67e99f0b82abf422998ef47cf5425245915cef90a805301a\" returns successfully" Jan 13 21:07:09.549900 kubelet[3044]: I0113 21:07:09.549543 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="tigera-operator/tigera-operator-c7ccbd65-qx4n4" podStartSLOduration=1.135807991 podStartE2EDuration="3.549509658s" podCreationTimestamp="2025-01-13 21:07:06 +0000 UTC" firstStartedPulling="2025-01-13 21:07:06.517371903 +0000 UTC m=+14.212920390" lastFinishedPulling="2025-01-13 21:07:08.931073565 +0000 UTC m=+16.626622057" observedRunningTime="2025-01-13 21:07:09.549429176 +0000 UTC m=+17.244977681" watchObservedRunningTime="2025-01-13 21:07:09.549509658 +0000 UTC m=+17.245058163" Jan 13 21:07:11.919299 kubelet[3044]: I0113 21:07:11.919236 3044 topology_manager.go:215] "Topology Admit Handler" podUID="ad440742-0970-4ac6-850a-96bde2b61f56" podNamespace="calico-system" podName="calico-typha-6477cccdb9-fnhf8" Jan 13 21:07:11.996082 kubelet[3044]: I0113 21:07:11.996059 3044 topology_manager.go:215] "Topology Admit Handler" podUID="1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292" podNamespace="calico-system" podName="calico-node-sqrpt" Jan 13 21:07:12.067464 kubelet[3044]: I0113 21:07:12.067446 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad440742-0970-4ac6-850a-96bde2b61f56-tigera-ca-bundle\") pod \"calico-typha-6477cccdb9-fnhf8\" (UID: 
\"ad440742-0970-4ac6-850a-96bde2b61f56\") " pod="calico-system/calico-typha-6477cccdb9-fnhf8" Jan 13 21:07:12.068142 kubelet[3044]: I0113 21:07:12.067531 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ad440742-0970-4ac6-850a-96bde2b61f56-typha-certs\") pod \"calico-typha-6477cccdb9-fnhf8\" (UID: \"ad440742-0970-4ac6-850a-96bde2b61f56\") " pod="calico-system/calico-typha-6477cccdb9-fnhf8" Jan 13 21:07:12.068142 kubelet[3044]: I0113 21:07:12.067547 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6fn\" (UniqueName: \"kubernetes.io/projected/ad440742-0970-4ac6-850a-96bde2b61f56-kube-api-access-vh6fn\") pod \"calico-typha-6477cccdb9-fnhf8\" (UID: \"ad440742-0970-4ac6-850a-96bde2b61f56\") " pod="calico-system/calico-typha-6477cccdb9-fnhf8" Jan 13 21:07:12.107653 kubelet[3044]: I0113 21:07:12.106812 3044 topology_manager.go:215] "Topology Admit Handler" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" podNamespace="calico-system" podName="csi-node-driver-x4zz7" Jan 13 21:07:12.109094 kubelet[3044]: E0113 21:07:12.108496 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:12.169138 kubelet[3044]: I0113 21:07:12.169116 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-flexvol-driver-host\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169228 kubelet[3044]: I0113 21:07:12.169149 3044 
reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-var-run-calico\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169228 kubelet[3044]: I0113 21:07:12.169162 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-cni-bin-dir\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169228 kubelet[3044]: I0113 21:07:12.169173 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-cni-log-dir\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169228 kubelet[3044]: I0113 21:07:12.169185 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thwvx\" (UniqueName: \"kubernetes.io/projected/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-kube-api-access-thwvx\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169228 kubelet[3044]: I0113 21:07:12.169197 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-tigera-ca-bundle\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169337 kubelet[3044]: I0113 21:07:12.169208 3044 reconciler_common.go:258] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-xtables-lock\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169337 kubelet[3044]: I0113 21:07:12.169220 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-cni-net-dir\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169337 kubelet[3044]: I0113 21:07:12.169240 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-var-lib-calico\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169337 kubelet[3044]: I0113 21:07:12.169251 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-lib-modules\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169337 kubelet[3044]: I0113 21:07:12.169261 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-policysync\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.169428 kubelet[3044]: I0113 21:07:12.169273 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" 
(UniqueName: \"kubernetes.io/secret/1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292-node-certs\") pod \"calico-node-sqrpt\" (UID: \"1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292\") " pod="calico-system/calico-node-sqrpt" Jan 13 21:07:12.226227 containerd[1650]: time="2025-01-13T21:07:12.226199806Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6477cccdb9-fnhf8,Uid:ad440742-0970-4ac6-850a-96bde2b61f56,Namespace:calico-system,Attempt:0,}" Jan 13 21:07:12.240190 containerd[1650]: time="2025-01-13T21:07:12.239959674Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:12.240190 containerd[1650]: time="2025-01-13T21:07:12.239991485Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:12.240190 containerd[1650]: time="2025-01-13T21:07:12.239999488Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:12.240190 containerd[1650]: time="2025-01-13T21:07:12.240044252Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:12.270203 kubelet[3044]: I0113 21:07:12.269527 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/59df9a6a-247e-477f-9e6c-e3512a44f477-varrun\") pod \"csi-node-driver-x4zz7\" (UID: \"59df9a6a-247e-477f-9e6c-e3512a44f477\") " pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:12.270203 kubelet[3044]: I0113 21:07:12.269555 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/59df9a6a-247e-477f-9e6c-e3512a44f477-socket-dir\") pod \"csi-node-driver-x4zz7\" (UID: \"59df9a6a-247e-477f-9e6c-e3512a44f477\") " pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:12.270203 kubelet[3044]: I0113 21:07:12.269609 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/59df9a6a-247e-477f-9e6c-e3512a44f477-kubelet-dir\") pod \"csi-node-driver-x4zz7\" (UID: \"59df9a6a-247e-477f-9e6c-e3512a44f477\") " pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:12.270203 kubelet[3044]: I0113 21:07:12.269623 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/59df9a6a-247e-477f-9e6c-e3512a44f477-registration-dir\") pod \"csi-node-driver-x4zz7\" (UID: \"59df9a6a-247e-477f-9e6c-e3512a44f477\") " pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:12.270203 kubelet[3044]: I0113 21:07:12.269652 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhmmc\" (UniqueName: \"kubernetes.io/projected/59df9a6a-247e-477f-9e6c-e3512a44f477-kube-api-access-xhmmc\") pod \"csi-node-driver-x4zz7\" (UID: \"59df9a6a-247e-477f-9e6c-e3512a44f477\") " 
pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:12.282447 kubelet[3044]: E0113 21:07:12.282426 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.282447 kubelet[3044]: W0113 21:07:12.282440 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.294510 kubelet[3044]: E0113 21:07:12.282481 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.310195 containerd[1650]: time="2025-01-13T21:07:12.310163293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sqrpt,Uid:1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292,Namespace:calico-system,Attempt:0,}" Jan 13 21:07:12.313908 containerd[1650]: time="2025-01-13T21:07:12.313886386Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6477cccdb9-fnhf8,Uid:ad440742-0970-4ac6-850a-96bde2b61f56,Namespace:calico-system,Attempt:0,} returns sandbox id \"d950dae2e59197a7a46c97978e2e1fa1c15b7d8001d36e2a9c4dc01de7c806b9\"" Jan 13 21:07:12.314766 containerd[1650]: time="2025-01-13T21:07:12.314748566Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Jan 13 21:07:12.370092 kubelet[3044]: E0113 21:07:12.370073 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.370092 kubelet[3044]: W0113 21:07:12.370088 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.370306 kubelet[3044]: E0113 21:07:12.370103 3044 plugins.go:730] "Error dynamically probing plugins" 
err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.370306 kubelet[3044]: E0113 21:07:12.370217 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.370306 kubelet[3044]: W0113 21:07:12.370222 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.370306 kubelet[3044]: E0113 21:07:12.370233 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.370493 kubelet[3044]: E0113 21:07:12.370419 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.370493 kubelet[3044]: W0113 21:07:12.370426 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.370493 kubelet[3044]: E0113 21:07:12.370439 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370611 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376712 kubelet[3044]: W0113 21:07:12.370616 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370643 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370748 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376712 kubelet[3044]: W0113 21:07:12.370752 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370763 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370869 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376712 kubelet[3044]: W0113 21:07:12.370874 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370885 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.376712 kubelet[3044]: E0113 21:07:12.370987 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376897 kubelet[3044]: W0113 21:07:12.370992 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371001 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371104 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376897 kubelet[3044]: W0113 21:07:12.371109 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371117 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371249 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376897 kubelet[3044]: W0113 21:07:12.371254 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371264 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.376897 kubelet[3044]: E0113 21:07:12.371368 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.376897 kubelet[3044]: W0113 21:07:12.371373 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371385 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371512 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.377105 kubelet[3044]: W0113 21:07:12.371517 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371524 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371619 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.377105 kubelet[3044]: W0113 21:07:12.371636 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371646 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371815 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.377105 kubelet[3044]: W0113 21:07:12.371820 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.377105 kubelet[3044]: E0113 21:07:12.371827 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.371927 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.410956 kubelet[3044]: W0113 21:07:12.371932 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.371938 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.372020 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.410956 kubelet[3044]: W0113 21:07:12.372025 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.372031 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.372141 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.410956 kubelet[3044]: W0113 21:07:12.372146 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.372152 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.410956 kubelet[3044]: E0113 21:07:12.372257 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411161 kubelet[3044]: W0113 21:07:12.372261 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372268 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372356 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411161 kubelet[3044]: W0113 21:07:12.372363 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372371 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372456 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411161 kubelet[3044]: W0113 21:07:12.372460 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372466 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.411161 kubelet[3044]: E0113 21:07:12.372551 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411161 kubelet[3044]: W0113 21:07:12.372555 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372561 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372666 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411362 kubelet[3044]: W0113 21:07:12.372670 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372676 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372770 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411362 kubelet[3044]: W0113 21:07:12.372775 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372780 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372883 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.411362 kubelet[3044]: W0113 21:07:12.372888 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.411362 kubelet[3044]: E0113 21:07:12.372895 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 13 21:07:12.412068 kubelet[3044]: E0113 21:07:12.373000 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.412068 kubelet[3044]: W0113 21:07:12.373004 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.412068 kubelet[3044]: E0113 21:07:12.373009 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 13 21:07:12.412068 kubelet[3044]: E0113 21:07:12.377434 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 13 21:07:12.412068 kubelet[3044]: W0113 21:07:12.377441 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 13 21:07:12.412068 kubelet[3044]: E0113 21:07:12.377450 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Jan 13 21:07:12.412694 kubelet[3044]: E0113 21:07:12.412682 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:12.412694 kubelet[3044]: W0113 21:07:12.412692 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:12.412820 kubelet[3044]: E0113 21:07:12.412705 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:12.456060 containerd[1650]: time="2025-01-13T21:07:12.455942293Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Jan 13 21:07:12.456060 containerd[1650]: time="2025-01-13T21:07:12.455991745Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Jan 13 21:07:12.456225 containerd[1650]: time="2025-01-13T21:07:12.456000543Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 21:07:12.456225 containerd[1650]: time="2025-01-13T21:07:12.456083294Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Jan 13 21:07:12.484094 containerd[1650]: time="2025-01-13T21:07:12.484074015Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sqrpt,Uid:1b5ce3ce-8e52-45f6-b033-8ac9b6cf3292,Namespace:calico-system,Attempt:0,} returns sandbox id \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\""
Jan 13 21:07:13.433818 kubelet[3044]: E0113 21:07:13.433788 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477"
Jan 13 21:07:13.906447 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3112415252.mount: Deactivated successfully.
Jan 13 21:07:14.727823 containerd[1650]: time="2025-01-13T21:07:14.727620837Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:14.727823 containerd[1650]: time="2025-01-13T21:07:14.727735564Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.1: active requests=0, bytes read=31343363"
Jan 13 21:07:14.728688 containerd[1650]: time="2025-01-13T21:07:14.728344984Z" level=info msg="ImageCreate event name:\"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:14.729428 containerd[1650]: time="2025-01-13T21:07:14.729397652Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:14.729875 containerd[1650]: time="2025-01-13T21:07:14.729760598Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.1\" with image id \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c\", size \"31343217\" in 2.414920274s"
Jan 13 21:07:14.729875 containerd[1650]: time="2025-01-13T21:07:14.729778647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Jan 13 21:07:14.730266 containerd[1650]: time="2025-01-13T21:07:14.730253643Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Jan 13 21:07:14.738964 containerd[1650]: time="2025-01-13T21:07:14.738944845Z" level=info msg="CreateContainer within sandbox \"d950dae2e59197a7a46c97978e2e1fa1c15b7d8001d36e2a9c4dc01de7c806b9\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jan 13 21:07:14.744775 containerd[1650]: time="2025-01-13T21:07:14.744702602Z" level=info msg="CreateContainer within sandbox \"d950dae2e59197a7a46c97978e2e1fa1c15b7d8001d36e2a9c4dc01de7c806b9\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"68c8212d2da2f1fc8fedb86fc534f5a312b90febc0ab64b06891ac1b683f72cc\""
Jan 13 21:07:14.745266 containerd[1650]: time="2025-01-13T21:07:14.745253117Z" level=info msg="StartContainer for \"68c8212d2da2f1fc8fedb86fc534f5a312b90febc0ab64b06891ac1b683f72cc\""
Jan 13 21:07:14.795295 containerd[1650]: time="2025-01-13T21:07:14.795233468Z" level=info msg="StartContainer for \"68c8212d2da2f1fc8fedb86fc534f5a312b90febc0ab64b06891ac1b683f72cc\" returns successfully"
Jan 13 21:07:15.433381 kubelet[3044]: E0113 21:07:15.433314 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477"
Jan 13 21:07:15.539586 kubelet[3044]: I0113 21:07:15.538509 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-typha-6477cccdb9-fnhf8" podStartSLOduration=2.123092332 podStartE2EDuration="4.538484056s" podCreationTimestamp="2025-01-13 21:07:11 +0000 UTC" firstStartedPulling="2025-01-13 21:07:12.314635434 +0000 UTC m=+20.010183923" lastFinishedPulling="2025-01-13 21:07:14.73002716 +0000 UTC m=+22.425575647" observedRunningTime="2025-01-13 21:07:15.538357742 +0000 UTC m=+23.233906238" watchObservedRunningTime="2025-01-13 21:07:15.538484056 +0000 UTC m=+23.234032547"
Jan 13 21:07:15.602212 kubelet[3044]: E0113 21:07:15.602142 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602212 kubelet[3044]: W0113 21:07:15.602155 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602212 kubelet[3044]: E0113 21:07:15.602169 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.602368 kubelet[3044]: E0113 21:07:15.602272 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602368 kubelet[3044]: W0113 21:07:15.602278 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602368 kubelet[3044]: E0113 21:07:15.602285 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.602448 kubelet[3044]: E0113 21:07:15.602386 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602448 kubelet[3044]: W0113 21:07:15.602390 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602448 kubelet[3044]: E0113 21:07:15.602396 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602488 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602765 kubelet[3044]: W0113 21:07:15.602492 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602498 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602584 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602765 kubelet[3044]: W0113 21:07:15.602588 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602594 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602696 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.602765 kubelet[3044]: W0113 21:07:15.602700 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.602765 kubelet[3044]: E0113 21:07:15.602706 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602787 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603035 kubelet[3044]: W0113 21:07:15.602792 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602800 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602885 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603035 kubelet[3044]: W0113 21:07:15.602889 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602895 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602972 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603035 kubelet[3044]: W0113 21:07:15.602976 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603035 kubelet[3044]: E0113 21:07:15.602981 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603056 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603478 kubelet[3044]: W0113 21:07:15.603060 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603065 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603140 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603478 kubelet[3044]: W0113 21:07:15.603144 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603150 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603238 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603478 kubelet[3044]: W0113 21:07:15.603242 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603248 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.603478 kubelet[3044]: E0113 21:07:15.603331 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603683 kubelet[3044]: W0113 21:07:15.603336 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603683 kubelet[3044]: E0113 21:07:15.603343 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.603683 kubelet[3044]: E0113 21:07:15.603446 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603683 kubelet[3044]: W0113 21:07:15.603450 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603683 kubelet[3044]: E0113 21:07:15.603456 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.603683 kubelet[3044]: E0113 21:07:15.603569 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.603683 kubelet[3044]: W0113 21:07:15.603574 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.603683 kubelet[3044]: E0113 21:07:15.603579 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.688756 kubelet[3044]: E0113 21:07:15.688070 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.688756 kubelet[3044]: W0113 21:07:15.688643 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.688756 kubelet[3044]: E0113 21:07:15.688678 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.688932 kubelet[3044]: E0113 21:07:15.688892 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.688932 kubelet[3044]: W0113 21:07:15.688907 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.688932 kubelet[3044]: E0113 21:07:15.688918 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.689067 kubelet[3044]: E0113 21:07:15.689058 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.689067 kubelet[3044]: W0113 21:07:15.689065 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.689117 kubelet[3044]: E0113 21:07:15.689073 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.689260 kubelet[3044]: E0113 21:07:15.689247 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.689260 kubelet[3044]: W0113 21:07:15.689253 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.689439 kubelet[3044]: E0113 21:07:15.689273 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.689439 kubelet[3044]: E0113 21:07:15.689422 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.689439 kubelet[3044]: W0113 21:07:15.689427 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.689627 kubelet[3044]: E0113 21:07:15.689457 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.689649 kubelet[3044]: E0113 21:07:15.689621 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.689649 kubelet[3044]: W0113 21:07:15.689633 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.689649 kubelet[3044]: E0113 21:07:15.689644 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.689920 kubelet[3044]: E0113 21:07:15.689733 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.689920 kubelet[3044]: W0113 21:07:15.689754 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.689920 kubelet[3044]: E0113 21:07:15.689820 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.690013 kubelet[3044]: E0113 21:07:15.690001 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690013 kubelet[3044]: W0113 21:07:15.690008 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690013 kubelet[3044]: E0113 21:07:15.690018 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.690139 kubelet[3044]: E0113 21:07:15.690128 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690139 kubelet[3044]: W0113 21:07:15.690133 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690209 kubelet[3044]: E0113 21:07:15.690197 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.690339 kubelet[3044]: E0113 21:07:15.690328 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690645 kubelet[3044]: W0113 21:07:15.690336 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690645 kubelet[3044]: E0113 21:07:15.690528 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.690645 kubelet[3044]: E0113 21:07:15.690569 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690645 kubelet[3044]: W0113 21:07:15.690573 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690645 kubelet[3044]: E0113 21:07:15.690583 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.690749 kubelet[3044]: E0113 21:07:15.690695 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690749 kubelet[3044]: W0113 21:07:15.690699 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690749 kubelet[3044]: E0113 21:07:15.690707 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.690898 kubelet[3044]: E0113 21:07:15.690891 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.690989 kubelet[3044]: W0113 21:07:15.690926 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.690989 kubelet[3044]: E0113 21:07:15.690940 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.691065 kubelet[3044]: E0113 21:07:15.691058 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.691099 kubelet[3044]: W0113 21:07:15.691094 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.691133 kubelet[3044]: E0113 21:07:15.691129 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.691237 kubelet[3044]: E0113 21:07:15.691227 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.691237 kubelet[3044]: W0113 21:07:15.691235 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.691280 kubelet[3044]: E0113 21:07:15.691246 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.691365 kubelet[3044]: E0113 21:07:15.691356 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.691365 kubelet[3044]: W0113 21:07:15.691363 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.691411 kubelet[3044]: E0113 21:07:15.691369 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:15.691475 kubelet[3044]: E0113 21:07:15.691455 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.691475 kubelet[3044]: W0113 21:07:15.691459 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.691475 kubelet[3044]: E0113 21:07:15.691466 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Jan 13 21:07:15.691622 kubelet[3044]: E0113 21:07:15.691614 3044 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jan 13 21:07:15.691622 kubelet[3044]: W0113 21:07:15.691620 3044 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jan 13 21:07:15.691662 kubelet[3044]: E0113 21:07:15.691627 3044 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Jan 13 21:07:16.397819 containerd[1650]: time="2025-01-13T21:07:16.397655970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:16.398068 containerd[1650]: time="2025-01-13T21:07:16.398020490Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1: active requests=0, bytes read=5362121"
Jan 13 21:07:16.398417 containerd[1650]: time="2025-01-13T21:07:16.398395881Z" level=info msg="ImageCreate event name:\"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:16.400630 containerd[1650]: time="2025-01-13T21:07:16.400613297Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:16.401464 containerd[1650]: time="2025-01-13T21:07:16.400883280Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" with image id \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c\", size \"6855165\" in 1.670614064s"
Jan 13 21:07:16.401464 containerd[1650]: time="2025-01-13T21:07:16.400899849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\""
Jan 13 21:07:16.403459 containerd[1650]: time="2025-01-13T21:07:16.403369749Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Jan 13 21:07:16.453506 containerd[1650]: time="2025-01-13T21:07:16.453459987Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9\""
Jan 13 21:07:16.453824 containerd[1650]: time="2025-01-13T21:07:16.453673819Z" level=info msg="StartContainer for \"a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9\""
Jan 13 21:07:16.503325 containerd[1650]: time="2025-01-13T21:07:16.503297724Z" level=info msg="StartContainer for \"a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9\" returns successfully"
Jan 13 21:07:16.523941 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9-rootfs.mount: Deactivated successfully.
Jan 13 21:07:16.535011 kubelet[3044]: I0113 21:07:16.534840 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 21:07:16.768607 containerd[1650]: time="2025-01-13T21:07:16.768555566Z" level=info msg="shim disconnected" id=a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9 namespace=k8s.io
Jan 13 21:07:16.768607 containerd[1650]: time="2025-01-13T21:07:16.768603620Z" level=warning msg="cleaning up after shim disconnected" id=a0ab03641ebc9e125206783a6a02f0878fb50d527b6c1abbffab941369593ed9 namespace=k8s.io
Jan 13 21:07:16.768607 containerd[1650]: time="2025-01-13T21:07:16.768611083Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Jan 13 21:07:17.433789 kubelet[3044]: E0113 21:07:17.433767 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477"
Jan 13 21:07:17.538736 containerd[1650]: time="2025-01-13T21:07:17.538604719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\""
Jan 13 21:07:19.434072 kubelet[3044]: E0113 21:07:19.434029 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477"
Jan 13 21:07:21.434157 kubelet[3044]: E0113 21:07:21.434136 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477"
Jan 13 21:07:22.019546 containerd[1650]: time="2025-01-13T21:07:22.019492458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:22.020969 containerd[1650]: time="2025-01-13T21:07:22.020777043Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.1: active requests=0, bytes read=96154154"
Jan 13 21:07:22.021580 containerd[1650]: time="2025-01-13T21:07:22.021191117Z" level=info msg="ImageCreate event name:\"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:22.023431 containerd[1650]: time="2025-01-13T21:07:22.023389897Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jan 13 21:07:22.024007 containerd[1650]: time="2025-01-13T21:07:22.023985823Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.1\" with image id \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b\", size \"97647238\" in 4.485355195s"
Jan 13 21:07:22.024140 containerd[1650]: time="2025-01-13T21:07:22.024007810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\""
Jan 13 21:07:22.026365 containerd[1650]: time="2025-01-13T21:07:22.026344424Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Jan 13 21:07:22.034977 containerd[1650]: time="2025-01-13T21:07:22.034951154Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d\""
Jan 13 21:07:22.035795 containerd[1650]: time="2025-01-13T21:07:22.035252866Z" level=info msg="StartContainer for \"42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d\""
Jan 13 21:07:22.101196 containerd[1650]: time="2025-01-13T21:07:22.101174986Z" level=info msg="StartContainer for \"42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d\" returns successfully"
Jan 13 21:07:23.413665 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d-rootfs.mount: Deactivated successfully.
Jan 13 21:07:23.416180 containerd[1650]: time="2025-01-13T21:07:23.415242407Z" level=info msg="shim disconnected" id=42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d namespace=k8s.io Jan 13 21:07:23.416180 containerd[1650]: time="2025-01-13T21:07:23.415277281Z" level=warning msg="cleaning up after shim disconnected" id=42785c568f6c826e1cab9159d26e1d434c6286ffeef49714ead4e5f271b9440d namespace=k8s.io Jan 13 21:07:23.416180 containerd[1650]: time="2025-01-13T21:07:23.415287479Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jan 13 21:07:23.433409 kubelet[3044]: E0113 21:07:23.433383 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:23.468202 kubelet[3044]: I0113 21:07:23.468183 3044 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Jan 13 21:07:23.489345 kubelet[3044]: I0113 21:07:23.489317 3044 topology_manager.go:215] "Topology Admit Handler" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" podNamespace="kube-system" podName="coredns-76f75df574-bbgth" Jan 13 21:07:23.492254 kubelet[3044]: I0113 21:07:23.492232 3044 topology_manager.go:215] "Topology Admit Handler" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" podNamespace="kube-system" podName="coredns-76f75df574-bhkcp" Jan 13 21:07:23.498358 kubelet[3044]: I0113 21:07:23.498334 3044 topology_manager.go:215] "Topology Admit Handler" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" podNamespace="calico-system" podName="calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:23.499433 kubelet[3044]: I0113 21:07:23.498451 3044 topology_manager.go:215] "Topology Admit Handler" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" podNamespace="calico-apiserver" 
podName="calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:23.499433 kubelet[3044]: I0113 21:07:23.499313 3044 topology_manager.go:215] "Topology Admit Handler" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" podNamespace="calico-apiserver" podName="calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:23.547720 containerd[1650]: time="2025-01-13T21:07:23.547700588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Jan 13 21:07:23.644763 kubelet[3044]: I0113 21:07:23.644733 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/843fc70f-b236-4057-8b13-ac6f976b6914-config-volume\") pod \"coredns-76f75df574-bbgth\" (UID: \"843fc70f-b236-4057-8b13-ac6f976b6914\") " pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:23.644889 kubelet[3044]: I0113 21:07:23.644786 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdhfc\" (UniqueName: \"kubernetes.io/projected/27d57ea0-66c3-451f-b6eb-d7050a90a389-kube-api-access-kdhfc\") pod \"calico-apiserver-7b989c8495-9bjj8\" (UID: \"27d57ea0-66c3-451f-b6eb-d7050a90a389\") " pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:23.644889 kubelet[3044]: I0113 21:07:23.644828 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lglfg\" (UniqueName: \"kubernetes.io/projected/843fc70f-b236-4057-8b13-ac6f976b6914-kube-api-access-lglfg\") pod \"coredns-76f75df574-bbgth\" (UID: \"843fc70f-b236-4057-8b13-ac6f976b6914\") " pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:23.644889 kubelet[3044]: I0113 21:07:23.644866 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jgw2\" (UniqueName: \"kubernetes.io/projected/78856142-3d1a-4f19-a50a-e5f8505c2330-kube-api-access-8jgw2\") pod 
\"calico-kube-controllers-76db674964-8lv5h\" (UID: \"78856142-3d1a-4f19-a50a-e5f8505c2330\") " pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:23.644889 kubelet[3044]: I0113 21:07:23.644884 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjjlg\" (UniqueName: \"kubernetes.io/projected/000eb469-bf65-41e7-a410-d2b847f032c6-kube-api-access-bjjlg\") pod \"coredns-76f75df574-bhkcp\" (UID: \"000eb469-bf65-41e7-a410-d2b847f032c6\") " pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:23.644995 kubelet[3044]: I0113 21:07:23.644899 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdjbp\" (UniqueName: \"kubernetes.io/projected/ddbbefea-15ee-4654-983a-5f453cee91ee-kube-api-access-jdjbp\") pod \"calico-apiserver-7b989c8495-wsw2f\" (UID: \"ddbbefea-15ee-4654-983a-5f453cee91ee\") " pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:23.644995 kubelet[3044]: I0113 21:07:23.644915 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/000eb469-bf65-41e7-a410-d2b847f032c6-config-volume\") pod \"coredns-76f75df574-bhkcp\" (UID: \"000eb469-bf65-41e7-a410-d2b847f032c6\") " pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:23.644995 kubelet[3044]: I0113 21:07:23.644931 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ddbbefea-15ee-4654-983a-5f453cee91ee-calico-apiserver-certs\") pod \"calico-apiserver-7b989c8495-wsw2f\" (UID: \"ddbbefea-15ee-4654-983a-5f453cee91ee\") " pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:23.644995 kubelet[3044]: I0113 21:07:23.644947 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/27d57ea0-66c3-451f-b6eb-d7050a90a389-calico-apiserver-certs\") pod \"calico-apiserver-7b989c8495-9bjj8\" (UID: \"27d57ea0-66c3-451f-b6eb-d7050a90a389\") " pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:23.644995 kubelet[3044]: I0113 21:07:23.644971 3044 reconciler_common.go:258] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78856142-3d1a-4f19-a50a-e5f8505c2330-tigera-ca-bundle\") pod \"calico-kube-controllers-76db674964-8lv5h\" (UID: \"78856142-3d1a-4f19-a50a-e5f8505c2330\") " pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:23.809725 containerd[1650]: time="2025-01-13T21:07:23.809701114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:0,}" Jan 13 21:07:23.809725 containerd[1650]: time="2025-01-13T21:07:23.809749792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:0,}" Jan 13 21:07:23.810324 containerd[1650]: time="2025-01-13T21:07:23.810217974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:0,}" Jan 13 21:07:23.812925 containerd[1650]: time="2025-01-13T21:07:23.812890278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:0,}" Jan 13 21:07:23.816952 containerd[1650]: time="2025-01-13T21:07:23.816811798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:0,}" Jan 13 21:07:24.067265 containerd[1650]: 
time="2025-01-13T21:07:24.066776470Z" level=error msg="Failed to destroy network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.069988 containerd[1650]: time="2025-01-13T21:07:24.069913313Z" level=error msg="Failed to destroy network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.071695 containerd[1650]: time="2025-01-13T21:07:24.071634133Z" level=error msg="encountered an error cleaning up failed sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.071695 containerd[1650]: time="2025-01-13T21:07:24.071678661Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.072046 containerd[1650]: time="2025-01-13T21:07:24.071756151Z" level=error msg="encountered an error cleaning up failed sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\", marking sandbox state as SANDBOX_UNKNOWN" 
error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.072046 containerd[1650]: time="2025-01-13T21:07:24.072014332Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.079817 containerd[1650]: time="2025-01-13T21:07:24.071826478Z" level=error msg="Failed to destroy network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.079993 containerd[1650]: time="2025-01-13T21:07:24.079980763Z" level=error msg="encountered an error cleaning up failed sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.080341 containerd[1650]: time="2025-01-13T21:07:24.080043570Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.080669 containerd[1650]: time="2025-01-13T21:07:24.080655493Z" level=error msg="Failed to destroy network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.080864 containerd[1650]: time="2025-01-13T21:07:24.080851075Z" level=error msg="encountered an error cleaning up failed sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.080915 containerd[1650]: time="2025-01-13T21:07:24.080904913Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081012 containerd[1650]: time="2025-01-13T21:07:24.081001503Z" level=error msg="Failed to destroy network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081595 containerd[1650]: time="2025-01-13T21:07:24.081161761Z" level=error 
msg="encountered an error cleaning up failed sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081595 containerd[1650]: time="2025-01-13T21:07:24.081183428Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081661 kubelet[3044]: E0113 21:07:24.081312 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081661 kubelet[3044]: E0113 21:07:24.081344 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081661 kubelet[3044]: E0113 21:07:24.081352 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:24.081661 kubelet[3044]: E0113 21:07:24.081365 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:24.081742 kubelet[3044]: E0113 21:07:24.081387 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:24.081742 kubelet[3044]: E0113 21:07:24.081403 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081742 kubelet[3044]: E0113 21:07:24.081418 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:24.081742 kubelet[3044]: E0113 21:07:24.081428 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:24.081818 kubelet[3044]: E0113 21:07:24.081431 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:24.081818 kubelet[3044]: E0113 21:07:24.081450 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:24.081818 kubelet[3044]: E0113 21:07:24.081464 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.081912 kubelet[3044]: E0113 21:07:24.081476 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:24.081912 kubelet[3044]: E0113 21:07:24.081486 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:24.081912 kubelet[3044]: E0113 21:07:24.081512 3044 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:24.082157 kubelet[3044]: E0113 21:07:24.081389 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:24.082157 kubelet[3044]: E0113 21:07:24.081540 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:24.082157 kubelet[3044]: E0113 21:07:24.082005 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.082231 kubelet[3044]: E0113 21:07:24.082030 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:24.082231 kubelet[3044]: E0113 21:07:24.082042 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:24.082231 kubelet[3044]: E0113 21:07:24.082065 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = 
Unknown desc = failed to setup network for sandbox \\\"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:24.548633 kubelet[3044]: I0113 21:07:24.548610 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35" Jan 13 21:07:24.550356 kubelet[3044]: I0113 21:07:24.550336 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6" Jan 13 21:07:24.587253 kubelet[3044]: I0113 21:07:24.586999 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb" Jan 13 21:07:24.588738 kubelet[3044]: I0113 21:07:24.588719 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3" Jan 13 21:07:24.590010 kubelet[3044]: I0113 21:07:24.589997 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7" Jan 13 21:07:24.620399 containerd[1650]: time="2025-01-13T21:07:24.620326301Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:24.620399 containerd[1650]: time="2025-01-13T21:07:24.620350443Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:24.621539 containerd[1650]: time="2025-01-13T21:07:24.621392500Z" level=info msg="StopPodSandbox for 
\"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:24.624965 containerd[1650]: time="2025-01-13T21:07:24.624205393Z" level=info msg="Ensure that sandbox f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7 in task-service has been cleanup successfully" Jan 13 21:07:24.625743 containerd[1650]: time="2025-01-13T21:07:24.624736076Z" level=info msg="Ensure that sandbox b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35 in task-service has been cleanup successfully" Jan 13 21:07:24.625797 containerd[1650]: time="2025-01-13T21:07:24.625624235Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:24.625858 containerd[1650]: time="2025-01-13T21:07:24.625849973Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:24.625928 containerd[1650]: time="2025-01-13T21:07:24.625830393Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:24.625928 containerd[1650]: time="2025-01-13T21:07:24.625926047Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:24.625980 containerd[1650]: time="2025-01-13T21:07:24.620338557Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:24.626032 containerd[1650]: time="2025-01-13T21:07:24.626020788Z" level=info msg="Ensure that sandbox c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3 in task-service has been cleanup successfully" Jan 13 21:07:24.626106 containerd[1650]: time="2025-01-13T21:07:24.625696103Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:24.626249 containerd[1650]: 
time="2025-01-13T21:07:24.626125847Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:24.626249 containerd[1650]: time="2025-01-13T21:07:24.626246725Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:24.626749 containerd[1650]: time="2025-01-13T21:07:24.625683577Z" level=info msg="Ensure that sandbox a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb in task-service has been cleanup successfully" Jan 13 21:07:24.626749 containerd[1650]: time="2025-01-13T21:07:24.626346351Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:24.626749 containerd[1650]: time="2025-01-13T21:07:24.626353154Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:24.627192 systemd[1]: run-netns-cni\x2dfc874abb\x2da1eb\x2dd9c9\x2db6c4\x2d57558bc9e8d3.mount: Deactivated successfully. 
Jan 13 21:07:24.628884 containerd[1650]: time="2025-01-13T21:07:24.627220170Z" level=info msg="Ensure that sandbox 3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6 in task-service has been cleanup successfully" Jan 13 21:07:24.628884 containerd[1650]: time="2025-01-13T21:07:24.627909860Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:1,}" Jan 13 21:07:24.630970 containerd[1650]: time="2025-01-13T21:07:24.629191695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:1,}" Jan 13 21:07:24.630970 containerd[1650]: time="2025-01-13T21:07:24.629262135Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:24.630970 containerd[1650]: time="2025-01-13T21:07:24.630067226Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:24.630970 containerd[1650]: time="2025-01-13T21:07:24.629292942Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:1,}" Jan 13 21:07:24.630635 systemd[1]: run-netns-cni\x2d3e31c0d9\x2debd9\x2d0807\x2df68b\x2d714b13f7ca1a.mount: Deactivated successfully. 
Jan 13 21:07:24.632140 containerd[1650]: time="2025-01-13T21:07:24.631184296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:1,}" Jan 13 21:07:24.632140 containerd[1650]: time="2025-01-13T21:07:24.631558166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:1,}" Jan 13 21:07:24.630710 systemd[1]: run-netns-cni\x2ddaafbf48\x2dde85\x2de4cb\x2dce10\x2d6f18bdd1e55b.mount: Deactivated successfully. Jan 13 21:07:24.630763 systemd[1]: run-netns-cni\x2d92f8d6a3\x2d866a\x2d94b4\x2d7fe9\x2d0664c898ee1e.mount: Deactivated successfully. Jan 13 21:07:24.634472 systemd[1]: run-netns-cni\x2d2a3f5409\x2d6409\x2deec2\x2db054\x2d6b66ef82dba0.mount: Deactivated successfully. Jan 13 21:07:24.710370 containerd[1650]: time="2025-01-13T21:07:24.709978804Z" level=error msg="Failed to destroy network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.710370 containerd[1650]: time="2025-01-13T21:07:24.710190401Z" level=error msg="encountered an error cleaning up failed sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.710370 containerd[1650]: time="2025-01-13T21:07:24.710235428Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:1,} failed, error" 
error="failed to setup network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.710504 kubelet[3044]: E0113 21:07:24.710468 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.710504 kubelet[3044]: E0113 21:07:24.710500 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:24.710563 kubelet[3044]: E0113 21:07:24.710513 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:24.710563 kubelet[3044]: E0113 21:07:24.710549 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:24.730368 containerd[1650]: time="2025-01-13T21:07:24.730338772Z" level=error msg="Failed to destroy network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.730645 containerd[1650]: time="2025-01-13T21:07:24.730624292Z" level=error msg="encountered an error cleaning up failed sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.730677 containerd[1650]: time="2025-01-13T21:07:24.730667284Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:1,} failed, error" error="failed to setup network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.730949 kubelet[3044]: E0113 21:07:24.730870 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.730949 kubelet[3044]: E0113 21:07:24.730912 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:24.730949 kubelet[3044]: E0113 21:07:24.730940 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:24.731082 kubelet[3044]: E0113 21:07:24.730978 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:24.736714 containerd[1650]: time="2025-01-13T21:07:24.736684727Z" level=error msg="Failed to destroy network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.736822 containerd[1650]: time="2025-01-13T21:07:24.736795439Z" level=error msg="Failed to destroy network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.737079 containerd[1650]: time="2025-01-13T21:07:24.737060813Z" level=error msg="encountered an error cleaning up failed sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.737122 containerd[1650]: time="2025-01-13T21:07:24.737102544Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.737260 kubelet[3044]: E0113 21:07:24.737243 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.737299 kubelet[3044]: E0113 21:07:24.737275 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:24.737299 kubelet[3044]: E0113 21:07:24.737294 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:24.738290 kubelet[3044]: E0113 21:07:24.737336 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown 
desc = failed to setup network for sandbox \\\"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:24.738290 kubelet[3044]: E0113 21:07:24.737677 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.738290 kubelet[3044]: E0113 21:07:24.737715 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:24.738408 containerd[1650]: time="2025-01-13T21:07:24.737553257Z" level=error msg="encountered an error cleaning up failed sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.738408 containerd[1650]: time="2025-01-13T21:07:24.737580534Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.738684 kubelet[3044]: E0113 21:07:24.737730 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:24.738684 kubelet[3044]: E0113 21:07:24.737757 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:24.744285 containerd[1650]: time="2025-01-13T21:07:24.744253815Z" level=error msg="Failed to destroy network for sandbox 
\"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.744492 containerd[1650]: time="2025-01-13T21:07:24.744473880Z" level=error msg="encountered an error cleaning up failed sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.744524 containerd[1650]: time="2025-01-13T21:07:24.744509308Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.744652 kubelet[3044]: E0113 21:07:24.744633 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:24.745194 kubelet[3044]: E0113 21:07:24.744718 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:24.745194 kubelet[3044]: E0113 21:07:24.744737 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:24.745194 kubelet[3044]: E0113 21:07:24.744770 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:25.414914 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f-shm.mount: Deactivated successfully. 
Jan 13 21:07:25.435552 containerd[1650]: time="2025-01-13T21:07:25.435504673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:0,}" Jan 13 21:07:25.478415 containerd[1650]: time="2025-01-13T21:07:25.478366352Z" level=error msg="Failed to destroy network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.479070 containerd[1650]: time="2025-01-13T21:07:25.479050599Z" level=error msg="encountered an error cleaning up failed sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.479119 containerd[1650]: time="2025-01-13T21:07:25.479085616Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.479620 kubelet[3044]: E0113 21:07:25.479240 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.479620 kubelet[3044]: E0113 21:07:25.479273 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:25.480560 kubelet[3044]: E0113 21:07:25.479708 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:25.480560 kubelet[3044]: E0113 21:07:25.479764 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:25.481293 systemd[1]: 
run-containerd-io.containerd.grpc.v1.cri-sandboxes-35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf-shm.mount: Deactivated successfully. Jan 13 21:07:25.592334 kubelet[3044]: I0113 21:07:25.592257 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05" Jan 13 21:07:25.593171 containerd[1650]: time="2025-01-13T21:07:25.592977730Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 21:07:25.593171 containerd[1650]: time="2025-01-13T21:07:25.593116797Z" level=info msg="Ensure that sandbox 8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05 in task-service has been cleanup successfully" Jan 13 21:07:25.595090 kubelet[3044]: I0113 21:07:25.594271 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf" Jan 13 21:07:25.595231 containerd[1650]: time="2025-01-13T21:07:25.594118618Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:25.595231 containerd[1650]: time="2025-01-13T21:07:25.594313956Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:25.595491 containerd[1650]: time="2025-01-13T21:07:25.595376168Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:25.595551 containerd[1650]: time="2025-01-13T21:07:25.595541548Z" level=info msg="Ensure that sandbox 35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf in task-service has been cleanup successfully" Jan 13 21:07:25.595700 containerd[1650]: time="2025-01-13T21:07:25.595690538Z" level=info msg="TearDown network for sandbox 
\"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:25.595812 containerd[1650]: time="2025-01-13T21:07:25.595734545Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:25.595957 containerd[1650]: time="2025-01-13T21:07:25.595945761Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:25.596896 containerd[1650]: time="2025-01-13T21:07:25.596856493Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:25.596155 systemd[1]: run-netns-cni\x2d6b24dbcb\x2d4b2f\x2dc97e\x2d33da\x2d1b6a207ad283.mount: Deactivated successfully. Jan 13 21:07:25.597051 containerd[1650]: time="2025-01-13T21:07:25.596981160Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:25.598798 containerd[1650]: time="2025-01-13T21:07:25.598783464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:2,}" Jan 13 21:07:25.599291 containerd[1650]: time="2025-01-13T21:07:25.599091451Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:1,}" Jan 13 21:07:25.599328 systemd[1]: run-netns-cni\x2dd7027a1c\x2de5e0\x2d78d8\x2d2ea9\x2d784c32e307ec.mount: Deactivated successfully. 
Jan 13 21:07:25.600827 kubelet[3044]: I0113 21:07:25.600816 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b" Jan 13 21:07:25.601251 containerd[1650]: time="2025-01-13T21:07:25.601163612Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:25.602626 containerd[1650]: time="2025-01-13T21:07:25.602574653Z" level=info msg="Ensure that sandbox 59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b in task-service has been cleanup successfully" Jan 13 21:07:25.602813 containerd[1650]: time="2025-01-13T21:07:25.602784228Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:25.602813 containerd[1650]: time="2025-01-13T21:07:25.602794736Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:25.603756 containerd[1650]: time="2025-01-13T21:07:25.603680778Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:25.603756 containerd[1650]: time="2025-01-13T21:07:25.603724991Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:25.603756 containerd[1650]: time="2025-01-13T21:07:25.603731642Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:25.604301 containerd[1650]: time="2025-01-13T21:07:25.604158141Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:2,}" Jan 13 21:07:25.605004 kubelet[3044]: I0113 21:07:25.604976 3044 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f" Jan 13 21:07:25.605309 containerd[1650]: time="2025-01-13T21:07:25.605143660Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:25.605309 containerd[1650]: time="2025-01-13T21:07:25.605241606Z" level=info msg="Ensure that sandbox aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f in task-service has been cleanup successfully" Jan 13 21:07:25.605406 containerd[1650]: time="2025-01-13T21:07:25.605395562Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:25.605468 containerd[1650]: time="2025-01-13T21:07:25.605439061Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:25.605768 containerd[1650]: time="2025-01-13T21:07:25.605680574Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:25.605768 containerd[1650]: time="2025-01-13T21:07:25.605727143Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:25.605768 containerd[1650]: time="2025-01-13T21:07:25.605733037Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:25.606152 kubelet[3044]: I0113 21:07:25.606141 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9" Jan 13 21:07:25.606469 containerd[1650]: time="2025-01-13T21:07:25.606321006Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:25.606469 containerd[1650]: 
time="2025-01-13T21:07:25.606410188Z" level=info msg="Ensure that sandbox 57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9 in task-service has been cleanup successfully" Jan 13 21:07:25.606569 containerd[1650]: time="2025-01-13T21:07:25.606559921Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:25.606821 containerd[1650]: time="2025-01-13T21:07:25.606594677Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:25.606821 containerd[1650]: time="2025-01-13T21:07:25.606727302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:2,}" Jan 13 21:07:25.607589 containerd[1650]: time="2025-01-13T21:07:25.607573592Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:25.607632 containerd[1650]: time="2025-01-13T21:07:25.607621458Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:25.607654 containerd[1650]: time="2025-01-13T21:07:25.607630799Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:25.608486 containerd[1650]: time="2025-01-13T21:07:25.608463657Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:2,}" Jan 13 21:07:25.608843 kubelet[3044]: I0113 21:07:25.608833 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d" Jan 13 21:07:25.609186 containerd[1650]: time="2025-01-13T21:07:25.609137734Z" 
level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:25.609413 containerd[1650]: time="2025-01-13T21:07:25.609391869Z" level=info msg="Ensure that sandbox 4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d in task-service has been cleanup successfully" Jan 13 21:07:25.609514 containerd[1650]: time="2025-01-13T21:07:25.609502289Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:25.609514 containerd[1650]: time="2025-01-13T21:07:25.609512177Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:25.609690 containerd[1650]: time="2025-01-13T21:07:25.609662231Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:25.609737 containerd[1650]: time="2025-01-13T21:07:25.609726571Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:25.609737 containerd[1650]: time="2025-01-13T21:07:25.609734165Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:25.609975 containerd[1650]: time="2025-01-13T21:07:25.609952511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:2,}" Jan 13 21:07:25.757359 containerd[1650]: time="2025-01-13T21:07:25.757327424Z" level=error msg="Failed to destroy network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.757613 
containerd[1650]: time="2025-01-13T21:07:25.757556671Z" level=error msg="encountered an error cleaning up failed sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.757613 containerd[1650]: time="2025-01-13T21:07:25.757596866Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.757787 kubelet[3044]: E0113 21:07:25.757774 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.757948 kubelet[3044]: E0113 21:07:25.757911 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:25.757948 kubelet[3044]: E0113 21:07:25.757932 3044 
kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:25.758058 kubelet[3044]: E0113 21:07:25.758029 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:25.763587 containerd[1650]: time="2025-01-13T21:07:25.763437591Z" level=error msg="Failed to destroy network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.763914 containerd[1650]: time="2025-01-13T21:07:25.763692224Z" level=error msg="encountered an error cleaning up failed sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.763914 containerd[1650]: time="2025-01-13T21:07:25.763725232Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.763998 kubelet[3044]: E0113 21:07:25.763911 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.763998 kubelet[3044]: E0113 21:07:25.763943 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:25.763998 kubelet[3044]: E0113 21:07:25.763958 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:25.764069 kubelet[3044]: E0113 21:07:25.763992 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:25.787593 containerd[1650]: time="2025-01-13T21:07:25.787555329Z" level=error msg="Failed to destroy network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.787852 containerd[1650]: time="2025-01-13T21:07:25.787834200Z" level=error msg="encountered an error cleaning up failed sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.788495 containerd[1650]: time="2025-01-13T21:07:25.787877363Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.788663 kubelet[3044]: E0113 21:07:25.788650 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.790544 kubelet[3044]: E0113 21:07:25.789969 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:25.790544 kubelet[3044]: E0113 21:07:25.789992 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:25.790544 kubelet[3044]: E0113 21:07:25.790029 3044 pod_workers.go:1298] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:25.793947 containerd[1650]: time="2025-01-13T21:07:25.793918088Z" level=error msg="Failed to destroy network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.794969 containerd[1650]: time="2025-01-13T21:07:25.794124418Z" level=error msg="encountered an error cleaning up failed sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.794969 containerd[1650]: time="2025-01-13T21:07:25.794161904Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:1,} failed, error" error="failed to setup network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.795034 kubelet[3044]: E0113 21:07:25.794289 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.795034 kubelet[3044]: E0113 21:07:25.794320 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:25.795034 kubelet[3044]: E0113 21:07:25.794334 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:25.795102 kubelet[3044]: E0113 21:07:25.794367 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup 
network for sandbox \\\"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:25.798251 containerd[1650]: time="2025-01-13T21:07:25.798216080Z" level=error msg="Failed to destroy network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.799295 containerd[1650]: time="2025-01-13T21:07:25.798445319Z" level=error msg="encountered an error cleaning up failed sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.799295 containerd[1650]: time="2025-01-13T21:07:25.798494870Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:2,} failed, error" error="failed to setup network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.799382 kubelet[3044]: E0113 21:07:25.798643 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.799382 kubelet[3044]: E0113 21:07:25.798671 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:25.799382 kubelet[3044]: E0113 21:07:25.798685 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:25.799451 kubelet[3044]: E0113 21:07:25.798734 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:25.803292 containerd[1650]: time="2025-01-13T21:07:25.803261407Z" level=error msg="Failed to destroy network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.803706 containerd[1650]: time="2025-01-13T21:07:25.803600064Z" level=error msg="encountered an error cleaning up failed sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.803832 containerd[1650]: time="2025-01-13T21:07:25.803752039Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.803958 kubelet[3044]: E0113 21:07:25.803940 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:25.804009 kubelet[3044]: E0113 21:07:25.803997 3044 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:25.804062 kubelet[3044]: E0113 21:07:25.804050 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:25.804104 kubelet[3044]: E0113 21:07:25.804091 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:26.416053 systemd[1]: run-netns-cni\x2d6c79b8d1\x2d985b\x2df5f6\x2db2b9\x2da7299cfe72cb.mount: Deactivated successfully. Jan 13 21:07:26.416342 systemd[1]: run-netns-cni\x2db9f42a4c\x2d2359\x2db274\x2dd4c6\x2d6de9d318b320.mount: Deactivated successfully. 
Jan 13 21:07:26.416539 systemd[1]: run-netns-cni\x2d454fbd9b\x2db212\x2db1bd\x2d6aa0\x2dfc10f4647785.mount: Deactivated successfully. Jan 13 21:07:26.416647 systemd[1]: run-netns-cni\x2d26076538\x2da2db\x2d53e0\x2dd814\x2de83b9b3f6911.mount: Deactivated successfully. Jan 13 21:07:26.610893 kubelet[3044]: I0113 21:07:26.610869 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998" Jan 13 21:07:26.611418 containerd[1650]: time="2025-01-13T21:07:26.611320388Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" Jan 13 21:07:26.612310 containerd[1650]: time="2025-01-13T21:07:26.612108616Z" level=info msg="Ensure that sandbox 1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998 in task-service has been cleanup successfully" Jan 13 21:07:26.612687 containerd[1650]: time="2025-01-13T21:07:26.612647947Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully" Jan 13 21:07:26.612687 containerd[1650]: time="2025-01-13T21:07:26.612662018Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:26.614025 containerd[1650]: time="2025-01-13T21:07:26.614007197Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:26.614076 containerd[1650]: time="2025-01-13T21:07:26.614056665Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:26.614076 containerd[1650]: time="2025-01-13T21:07:26.614063966Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:26.614297 containerd[1650]: time="2025-01-13T21:07:26.614285022Z" level=info 
msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:26.614351 containerd[1650]: time="2025-01-13T21:07:26.614326594Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:26.614351 containerd[1650]: time="2025-01-13T21:07:26.614335523Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:26.614541 kubelet[3044]: I0113 21:07:26.614520 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4" Jan 13 21:07:26.615494 systemd[1]: run-netns-cni\x2d793efbe0\x2ddcae\x2dd4f7\x2dc8a2\x2d457ff77d99e0.mount: Deactivated successfully. Jan 13 21:07:26.615748 containerd[1650]: time="2025-01-13T21:07:26.615734281Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:26.616998 containerd[1650]: time="2025-01-13T21:07:26.616610132Z" level=info msg="Ensure that sandbox e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4 in task-service has been cleanup successfully" Jan 13 21:07:26.618925 containerd[1650]: time="2025-01-13T21:07:26.617219346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:3,}" Jan 13 21:07:26.618356 systemd[1]: run-netns-cni\x2d00675c87\x2d6e1a\x2d7f5b\x2d69ae\x2d62bda671f3d8.mount: Deactivated successfully. 
Jan 13 21:07:26.619106 containerd[1650]: time="2025-01-13T21:07:26.619064583Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:26.619106 containerd[1650]: time="2025-01-13T21:07:26.619078688Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:26.619673 containerd[1650]: time="2025-01-13T21:07:26.619617696Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:26.619673 containerd[1650]: time="2025-01-13T21:07:26.619667580Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:26.619673 containerd[1650]: time="2025-01-13T21:07:26.619674174Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:26.621583 containerd[1650]: time="2025-01-13T21:07:26.621470523Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:26.621583 containerd[1650]: time="2025-01-13T21:07:26.621514694Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:26.621583 containerd[1650]: time="2025-01-13T21:07:26.621521205Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:26.623105 containerd[1650]: time="2025-01-13T21:07:26.622859375Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:3,}" Jan 13 21:07:26.623524 kubelet[3044]: I0113 21:07:26.623363 3044 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a" Jan 13 21:07:26.624669 containerd[1650]: time="2025-01-13T21:07:26.624014247Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" Jan 13 21:07:26.625028 containerd[1650]: time="2025-01-13T21:07:26.624926414Z" level=info msg="Ensure that sandbox 12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a in task-service has been cleanup successfully" Jan 13 21:07:26.625710 containerd[1650]: time="2025-01-13T21:07:26.625694918Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully" Jan 13 21:07:26.625710 containerd[1650]: time="2025-01-13T21:07:26.625707288Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully" Jan 13 21:07:26.630498 containerd[1650]: time="2025-01-13T21:07:26.630349331Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 21:07:26.630498 containerd[1650]: time="2025-01-13T21:07:26.630411208Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:26.630498 containerd[1650]: time="2025-01-13T21:07:26.630418384Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:26.630384 systemd[1]: run-netns-cni\x2dfa331cb9\x2dfa82\x2d0547\x2d5131\x2d0d75c02f0081.mount: Deactivated successfully. 
Jan 13 21:07:26.637047 containerd[1650]: time="2025-01-13T21:07:26.636411711Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:26.637047 containerd[1650]: time="2025-01-13T21:07:26.636490687Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:26.637047 containerd[1650]: time="2025-01-13T21:07:26.636499190Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:26.637187 kubelet[3044]: I0113 21:07:26.636780 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69" Jan 13 21:07:26.637762 containerd[1650]: time="2025-01-13T21:07:26.637749065Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:26.637972 containerd[1650]: time="2025-01-13T21:07:26.637961890Z" level=info msg="Ensure that sandbox 0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69 in task-service has been cleanup successfully" Jan 13 21:07:26.638401 containerd[1650]: time="2025-01-13T21:07:26.638217444Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:26.638401 containerd[1650]: time="2025-01-13T21:07:26.638227156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:3,}" Jan 13 21:07:26.638401 containerd[1650]: time="2025-01-13T21:07:26.638228340Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:26.638831 containerd[1650]: time="2025-01-13T21:07:26.638795594Z" level=info msg="StopPodSandbox for 
\"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:26.638967 containerd[1650]: time="2025-01-13T21:07:26.638958143Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:26.639023 containerd[1650]: time="2025-01-13T21:07:26.639014550Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:26.639777 containerd[1650]: time="2025-01-13T21:07:26.639359240Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:26.639777 containerd[1650]: time="2025-01-13T21:07:26.639419447Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:26.639777 containerd[1650]: time="2025-01-13T21:07:26.639428362Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:26.640312 containerd[1650]: time="2025-01-13T21:07:26.639867679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:3,}" Jan 13 21:07:26.640377 kubelet[3044]: I0113 21:07:26.640347 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b" Jan 13 21:07:26.640663 containerd[1650]: time="2025-01-13T21:07:26.640652467Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" Jan 13 21:07:26.640899 containerd[1650]: time="2025-01-13T21:07:26.640847835Z" level=info msg="Ensure that sandbox 50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b in task-service has been cleanup successfully" Jan 13 21:07:26.641380 containerd[1650]: 
time="2025-01-13T21:07:26.641022545Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully" Jan 13 21:07:26.641522 containerd[1650]: time="2025-01-13T21:07:26.641510008Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully" Jan 13 21:07:26.642404 containerd[1650]: time="2025-01-13T21:07:26.642291128Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:26.642629 containerd[1650]: time="2025-01-13T21:07:26.642499214Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:26.642629 containerd[1650]: time="2025-01-13T21:07:26.642509225Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:26.643390 containerd[1650]: time="2025-01-13T21:07:26.643021191Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:26.643390 containerd[1650]: time="2025-01-13T21:07:26.643075837Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:26.643390 containerd[1650]: time="2025-01-13T21:07:26.643085197Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:26.644347 containerd[1650]: time="2025-01-13T21:07:26.644330184Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:3,}" Jan 13 21:07:26.646783 kubelet[3044]: I0113 21:07:26.646767 3044 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6" Jan 13 21:07:26.647505 containerd[1650]: time="2025-01-13T21:07:26.647487617Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:26.648037 containerd[1650]: time="2025-01-13T21:07:26.647766589Z" level=info msg="Ensure that sandbox 78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6 in task-service has been cleanup successfully" Jan 13 21:07:26.648169 containerd[1650]: time="2025-01-13T21:07:26.648099650Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:26.648169 containerd[1650]: time="2025-01-13T21:07:26.648110538Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:26.649356 containerd[1650]: time="2025-01-13T21:07:26.649285913Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:26.649356 containerd[1650]: time="2025-01-13T21:07:26.649332411Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:26.649356 containerd[1650]: time="2025-01-13T21:07:26.649339142Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:26.650104 containerd[1650]: time="2025-01-13T21:07:26.650086670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:2,}" Jan 13 21:07:26.776742 containerd[1650]: time="2025-01-13T21:07:26.776716742Z" level=error msg="Failed to destroy network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.778293 containerd[1650]: time="2025-01-13T21:07:26.778273728Z" level=error msg="encountered an error cleaning up failed sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.778487 containerd[1650]: time="2025-01-13T21:07:26.778469935Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.778824 kubelet[3044]: E0113 21:07:26.778613 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.778824 kubelet[3044]: E0113 21:07:26.778659 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:26.778824 kubelet[3044]: E0113 21:07:26.778675 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:26.778928 kubelet[3044]: E0113 21:07:26.778717 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:26.782531 containerd[1650]: time="2025-01-13T21:07:26.782507561Z" level=error msg="Failed to destroy network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.784264 containerd[1650]: time="2025-01-13T21:07:26.784243494Z" level=error msg="encountered an error cleaning up failed sandbox 
\"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.784310 containerd[1650]: time="2025-01-13T21:07:26.784283528Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.784526 kubelet[3044]: E0113 21:07:26.784417 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.784526 kubelet[3044]: E0113 21:07:26.784457 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:26.784526 kubelet[3044]: E0113 21:07:26.784471 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:26.784622 kubelet[3044]: E0113 21:07:26.784506 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:26.790666 containerd[1650]: time="2025-01-13T21:07:26.790640426Z" level=error msg="Failed to destroy network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792501 containerd[1650]: time="2025-01-13T21:07:26.790888918Z" level=error msg="encountered an error cleaning up failed sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 13 21:07:26.792501 containerd[1650]: time="2025-01-13T21:07:26.790925137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792501 containerd[1650]: time="2025-01-13T21:07:26.791689444Z" level=error msg="Failed to destroy network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792501 containerd[1650]: time="2025-01-13T21:07:26.791864531Z" level=error msg="encountered an error cleaning up failed sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792501 containerd[1650]: time="2025-01-13T21:07:26.791887386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:3,} failed, error" error="failed to setup network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792670 kubelet[3044]: 
E0113 21:07:26.791042 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792670 kubelet[3044]: E0113 21:07:26.791076 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:26.792670 kubelet[3044]: E0113 21:07:26.791089 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:26.792738 kubelet[3044]: E0113 21:07:26.791121 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:26.792738 kubelet[3044]: E0113 21:07:26.792250 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.792738 kubelet[3044]: E0113 21:07:26.792270 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:26.792864 kubelet[3044]: E0113 21:07:26.792283 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:26.792864 kubelet[3044]: E0113 21:07:26.792307 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:26.806754 containerd[1650]: time="2025-01-13T21:07:26.806729267Z" level=error msg="Failed to destroy network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.807443 containerd[1650]: time="2025-01-13T21:07:26.807231029Z" level=error msg="encountered an error cleaning up failed sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.807792 containerd[1650]: time="2025-01-13T21:07:26.807777816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:2,} failed, error" error="failed to setup network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.807919 containerd[1650]: time="2025-01-13T21:07:26.807611467Z" level=error 
msg="Failed to destroy network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.808108 kubelet[3044]: E0113 21:07:26.808091 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.808242 kubelet[3044]: E0113 21:07:26.808128 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:26.808242 kubelet[3044]: E0113 21:07:26.808147 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:26.808242 kubelet[3044]: E0113 21:07:26.808181 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:26.808613 containerd[1650]: time="2025-01-13T21:07:26.808356290Z" level=error msg="encountered an error cleaning up failed sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.808613 containerd[1650]: time="2025-01-13T21:07:26.808458087Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.808703 kubelet[3044]: E0113 21:07:26.808542 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:26.808703 
kubelet[3044]: E0113 21:07:26.808562 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:26.808703 kubelet[3044]: E0113 21:07:26.808573 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:26.808770 kubelet[3044]: E0113 21:07:26.808595 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:27.414496 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c-shm.mount: Deactivated successfully. 
Jan 13 21:07:27.414700 systemd[1]: run-netns-cni\x2d22735f29\x2d79e7\x2d20bc\x2d2695\x2d0e79b0168c42.mount: Deactivated successfully. Jan 13 21:07:27.414829 systemd[1]: run-netns-cni\x2d31f7ea5b\x2d9549\x2dfb80\x2ddf99\x2da306764933e4.mount: Deactivated successfully. Jan 13 21:07:27.414933 systemd[1]: run-netns-cni\x2d8d964225\x2d0322\x2d2b6e\x2d33f2\x2d974625ad4a3c.mount: Deactivated successfully. Jan 13 21:07:27.651606 kubelet[3044]: I0113 21:07:27.651589 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7" Jan 13 21:07:27.652433 containerd[1650]: time="2025-01-13T21:07:27.652416285Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:27.652731 containerd[1650]: time="2025-01-13T21:07:27.652536708Z" level=info msg="Ensure that sandbox 5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7 in task-service has been cleanup successfully" Jan 13 21:07:27.654793 systemd[1]: run-netns-cni\x2df50499cb\x2d966b\x2d3f18\x2d8fea\x2d6cbbb332668f.mount: Deactivated successfully. 
Jan 13 21:07:27.655305 containerd[1650]: time="2025-01-13T21:07:27.655050888Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:27.655305 containerd[1650]: time="2025-01-13T21:07:27.655157722Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:27.655375 containerd[1650]: time="2025-01-13T21:07:27.655341821Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:27.655477 containerd[1650]: time="2025-01-13T21:07:27.655406522Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:27.655477 containerd[1650]: time="2025-01-13T21:07:27.655415333Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:27.655647 containerd[1650]: time="2025-01-13T21:07:27.655616141Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:27.655673 containerd[1650]: time="2025-01-13T21:07:27.655655288Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:27.655673 containerd[1650]: time="2025-01-13T21:07:27.655662171Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:27.655922 containerd[1650]: time="2025-01-13T21:07:27.655908464Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:27.656623 containerd[1650]: time="2025-01-13T21:07:27.656001578Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 
13 21:07:27.656623 containerd[1650]: time="2025-01-13T21:07:27.656010566Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:27.656677 kubelet[3044]: I0113 21:07:27.656551 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6" Jan 13 21:07:27.657388 containerd[1650]: time="2025-01-13T21:07:27.657336547Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\"" Jan 13 21:07:27.657466 containerd[1650]: time="2025-01-13T21:07:27.657447197Z" level=info msg="Ensure that sandbox dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6 in task-service has been cleanup successfully" Jan 13 21:07:27.659091 containerd[1650]: time="2025-01-13T21:07:27.659075049Z" level=info msg="TearDown network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully" Jan 13 21:07:27.659091 containerd[1650]: time="2025-01-13T21:07:27.659089252Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully" Jan 13 21:07:27.659841 containerd[1650]: time="2025-01-13T21:07:27.659158485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:4,}" Jan 13 21:07:27.660010 systemd[1]: run-netns-cni\x2d9e131b3e\x2d7314\x2df8ee\x2d1505\x2dd19ff363213c.mount: Deactivated successfully. 
Jan 13 21:07:27.661750 containerd[1650]: time="2025-01-13T21:07:27.660944910Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" Jan 13 21:07:27.661750 containerd[1650]: time="2025-01-13T21:07:27.660992203Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully" Jan 13 21:07:27.661750 containerd[1650]: time="2025-01-13T21:07:27.660998785Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully" Jan 13 21:07:27.661861 containerd[1650]: time="2025-01-13T21:07:27.661760785Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:27.661861 containerd[1650]: time="2025-01-13T21:07:27.661798234Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:27.661861 containerd[1650]: time="2025-01-13T21:07:27.661845660Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:27.679996 containerd[1650]: time="2025-01-13T21:07:27.679078605Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:27.679996 containerd[1650]: time="2025-01-13T21:07:27.679230214Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:27.679996 containerd[1650]: time="2025-01-13T21:07:27.679238640Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:27.680114 kubelet[3044]: I0113 21:07:27.679218 3044 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c" Jan 13 21:07:27.684749 containerd[1650]: time="2025-01-13T21:07:27.684132232Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:27.684749 containerd[1650]: time="2025-01-13T21:07:27.684695793Z" level=info msg="Ensure that sandbox 577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c in task-service has been cleanup successfully" Jan 13 21:07:27.685777 containerd[1650]: time="2025-01-13T21:07:27.685179135Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:4,}" Jan 13 21:07:27.685777 containerd[1650]: time="2025-01-13T21:07:27.685714106Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:27.685777 containerd[1650]: time="2025-01-13T21:07:27.685724161Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:27.687753 containerd[1650]: time="2025-01-13T21:07:27.687730791Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:27.688352 containerd[1650]: time="2025-01-13T21:07:27.688338754Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:27.688352 containerd[1650]: time="2025-01-13T21:07:27.688350414Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:27.688753 containerd[1650]: time="2025-01-13T21:07:27.688496729Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:27.688753 containerd[1650]: 
time="2025-01-13T21:07:27.688535415Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:27.688753 containerd[1650]: time="2025-01-13T21:07:27.688541381Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:27.689538 containerd[1650]: time="2025-01-13T21:07:27.689495562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:3,}" Jan 13 21:07:27.691051 kubelet[3044]: I0113 21:07:27.690281 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c" Jan 13 21:07:27.691110 containerd[1650]: time="2025-01-13T21:07:27.690631147Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:27.691110 containerd[1650]: time="2025-01-13T21:07:27.690755931Z" level=info msg="Ensure that sandbox 2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c in task-service has been cleanup successfully" Jan 13 21:07:27.691110 containerd[1650]: time="2025-01-13T21:07:27.690873003Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:27.691110 containerd[1650]: time="2025-01-13T21:07:27.690881214Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:27.691731 containerd[1650]: time="2025-01-13T21:07:27.691714957Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" Jan 13 21:07:27.691911 containerd[1650]: time="2025-01-13T21:07:27.691899391Z" level=info msg="TearDown network for sandbox 
\"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully" Jan 13 21:07:27.691911 containerd[1650]: time="2025-01-13T21:07:27.691907790Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:27.692249 containerd[1650]: time="2025-01-13T21:07:27.692237634Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:27.692489 containerd[1650]: time="2025-01-13T21:07:27.692441883Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:27.692489 containerd[1650]: time="2025-01-13T21:07:27.692453034Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:27.693737 containerd[1650]: time="2025-01-13T21:07:27.693723851Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:27.693991 containerd[1650]: time="2025-01-13T21:07:27.693913481Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:27.693991 containerd[1650]: time="2025-01-13T21:07:27.693922848Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:27.694591 kubelet[3044]: I0113 21:07:27.694391 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5" Jan 13 21:07:27.695998 containerd[1650]: time="2025-01-13T21:07:27.695917278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:4,}" Jan 13 21:07:27.696940 containerd[1650]: 
time="2025-01-13T21:07:27.696926074Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:27.697300 containerd[1650]: time="2025-01-13T21:07:27.697204824Z" level=info msg="Ensure that sandbox f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5 in task-service has been cleanup successfully" Jan 13 21:07:27.698041 containerd[1650]: time="2025-01-13T21:07:27.698027075Z" level=info msg="TearDown network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:27.698125 containerd[1650]: time="2025-01-13T21:07:27.698116489Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:27.700187 containerd[1650]: time="2025-01-13T21:07:27.700155510Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:27.700239 containerd[1650]: time="2025-01-13T21:07:27.700218057Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:27.700239 containerd[1650]: time="2025-01-13T21:07:27.700228318Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:27.702823 containerd[1650]: time="2025-01-13T21:07:27.702317093Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:27.703362 containerd[1650]: time="2025-01-13T21:07:27.702411814Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:27.703798 containerd[1650]: time="2025-01-13T21:07:27.703747827Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:27.704896 
containerd[1650]: time="2025-01-13T21:07:27.704885085Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:27.705000 containerd[1650]: time="2025-01-13T21:07:27.704990381Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:27.705422 containerd[1650]: time="2025-01-13T21:07:27.705082664Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:27.707864 containerd[1650]: time="2025-01-13T21:07:27.707843241Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:4,}" Jan 13 21:07:27.708372 kubelet[3044]: I0113 21:07:27.708360 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8" Jan 13 21:07:27.711638 containerd[1650]: time="2025-01-13T21:07:27.711617831Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\"" Jan 13 21:07:27.713083 containerd[1650]: time="2025-01-13T21:07:27.712746408Z" level=info msg="Ensure that sandbox f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8 in task-service has been cleanup successfully" Jan 13 21:07:27.713083 containerd[1650]: time="2025-01-13T21:07:27.712871104Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully" Jan 13 21:07:27.713083 containerd[1650]: time="2025-01-13T21:07:27.712880312Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully" Jan 13 21:07:27.713826 containerd[1650]: time="2025-01-13T21:07:27.713691114Z" level=info msg="StopPodSandbox for 
\"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" Jan 13 21:07:27.715164 containerd[1650]: time="2025-01-13T21:07:27.714884729Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully" Jan 13 21:07:27.715623 containerd[1650]: time="2025-01-13T21:07:27.715612479Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully" Jan 13 21:07:27.719187 containerd[1650]: time="2025-01-13T21:07:27.719165725Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 21:07:27.720195 containerd[1650]: time="2025-01-13T21:07:27.720139529Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:27.720195 containerd[1650]: time="2025-01-13T21:07:27.720150932Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:27.720645 containerd[1650]: time="2025-01-13T21:07:27.720625465Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:27.720680 containerd[1650]: time="2025-01-13T21:07:27.720673062Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:27.720699 containerd[1650]: time="2025-01-13T21:07:27.720680106Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:27.721911 containerd[1650]: time="2025-01-13T21:07:27.721827367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:4,}" Jan 13 21:07:27.742818 containerd[1650]: time="2025-01-13T21:07:27.742778001Z" 
level=error msg="Failed to destroy network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.743289 containerd[1650]: time="2025-01-13T21:07:27.743176222Z" level=error msg="encountered an error cleaning up failed sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.743289 containerd[1650]: time="2025-01-13T21:07:27.743220327Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.743657 kubelet[3044]: E0113 21:07:27.743452 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.743657 kubelet[3044]: E0113 21:07:27.743487 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:27.743657 kubelet[3044]: E0113 21:07:27.743502 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:27.744440 kubelet[3044]: E0113 21:07:27.743536 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:27.805431 containerd[1650]: time="2025-01-13T21:07:27.805338722Z" level=error msg="Failed to destroy network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" Jan 13 21:07:27.806679 containerd[1650]: time="2025-01-13T21:07:27.806639677Z" level=error msg="encountered an error cleaning up failed sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.806807 containerd[1650]: time="2025-01-13T21:07:27.806746200Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:3,} failed, error" error="failed to setup network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.807176 kubelet[3044]: E0113 21:07:27.807113 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.807176 kubelet[3044]: E0113 21:07:27.807146 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:27.807176 
kubelet[3044]: E0113 21:07:27.807159 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:27.807780 kubelet[3044]: E0113 21:07:27.807303 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:27.846249 containerd[1650]: time="2025-01-13T21:07:27.843724663Z" level=error msg="Failed to destroy network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.846341 containerd[1650]: time="2025-01-13T21:07:27.846251050Z" level=error msg="encountered an error cleaning up failed sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.846341 containerd[1650]: time="2025-01-13T21:07:27.846287199Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:4,} failed, error" error="failed to setup network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.847162 kubelet[3044]: E0113 21:07:27.846513 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.847162 kubelet[3044]: E0113 21:07:27.846554 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:27.847162 kubelet[3044]: E0113 21:07:27.846570 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:27.847250 kubelet[3044]: E0113 21:07:27.846605 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:27.852918 containerd[1650]: time="2025-01-13T21:07:27.852891532Z" level=error msg="Failed to destroy network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.853297 containerd[1650]: time="2025-01-13T21:07:27.853195718Z" level=error msg="encountered an error cleaning up failed sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.853297 containerd[1650]: time="2025-01-13T21:07:27.853232671Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.853564 kubelet[3044]: E0113 21:07:27.853464 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.853564 kubelet[3044]: E0113 21:07:27.853501 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:27.853564 kubelet[3044]: E0113 21:07:27.853514 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:27.853648 kubelet[3044]: E0113 21:07:27.853546 3044 
pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:27.862742 containerd[1650]: time="2025-01-13T21:07:27.862671153Z" level=error msg="Failed to destroy network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.863338 containerd[1650]: time="2025-01-13T21:07:27.862999961Z" level=error msg="encountered an error cleaning up failed sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.863338 containerd[1650]: time="2025-01-13T21:07:27.863036062Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.863411 kubelet[3044]: E0113 21:07:27.863228 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.863411 kubelet[3044]: E0113 21:07:27.863264 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:27.863411 kubelet[3044]: E0113 21:07:27.863292 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:27.863483 kubelet[3044]: E0113 21:07:27.863327 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: 
code = Unknown desc = failed to setup network for sandbox \\\"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:27.867261 containerd[1650]: time="2025-01-13T21:07:27.867242938Z" level=error msg="Failed to destroy network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.867485 containerd[1650]: time="2025-01-13T21:07:27.867471763Z" level=error msg="encountered an error cleaning up failed sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.867643 containerd[1650]: time="2025-01-13T21:07:27.867632216Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.868028 kubelet[3044]: E0113 21:07:27.868009 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:27.868080 kubelet[3044]: E0113 21:07:27.868039 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:27.868080 kubelet[3044]: E0113 21:07:27.868054 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:27.868123 kubelet[3044]: E0113 21:07:27.868087 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" 
podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:28.414546 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82-shm.mount: Deactivated successfully. Jan 13 21:07:28.414761 systemd[1]: run-netns-cni\x2d67fd05ca\x2dc9ef\x2d2724\x2d011c\x2df530eeaa9644.mount: Deactivated successfully. Jan 13 21:07:28.415276 systemd[1]: run-netns-cni\x2d7d32ef52\x2dd6be\x2d6e66\x2d93c5\x2de0afec751ee4.mount: Deactivated successfully. Jan 13 21:07:28.415395 systemd[1]: run-netns-cni\x2d706872e1\x2d4d57\x2dfedc\x2ddfcb\x2d314edaa058f4.mount: Deactivated successfully. Jan 13 21:07:28.415493 systemd[1]: run-netns-cni\x2d5dab471f\x2dce4c\x2d0ad1\x2dabe3\x2d892212303bf3.mount: Deactivated successfully. Jan 13 21:07:28.772126 kubelet[3044]: I0113 21:07:28.771003 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481" Jan 13 21:07:28.772464 containerd[1650]: time="2025-01-13T21:07:28.771459221Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:28.784599 containerd[1650]: time="2025-01-13T21:07:28.784567846Z" level=info msg="Ensure that sandbox 1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481 in task-service has been cleanup successfully" Jan 13 21:07:28.786345 containerd[1650]: time="2025-01-13T21:07:28.786318137Z" level=info msg="TearDown network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" successfully" Jan 13 21:07:28.786345 containerd[1650]: time="2025-01-13T21:07:28.786333238Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" returns successfully" Jan 13 21:07:28.786639 systemd[1]: run-netns-cni\x2dae8584f5\x2d4b03\x2d701f\x2dfb54\x2dfeb40af72974.mount: Deactivated successfully. 
Jan 13 21:07:28.788370 containerd[1650]: time="2025-01-13T21:07:28.788350444Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:28.788408 containerd[1650]: time="2025-01-13T21:07:28.788401290Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:28.788430 containerd[1650]: time="2025-01-13T21:07:28.788408382Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:28.790053 containerd[1650]: time="2025-01-13T21:07:28.790010005Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" Jan 13 21:07:28.790092 containerd[1650]: time="2025-01-13T21:07:28.790063724Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully" Jan 13 21:07:28.790092 containerd[1650]: time="2025-01-13T21:07:28.790070923Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:28.790222 containerd[1650]: time="2025-01-13T21:07:28.790201764Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:28.790274 containerd[1650]: time="2025-01-13T21:07:28.790248297Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:28.790274 containerd[1650]: time="2025-01-13T21:07:28.790256672Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:28.793697 containerd[1650]: time="2025-01-13T21:07:28.793537471Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:28.793697 
containerd[1650]: time="2025-01-13T21:07:28.793638974Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:28.793697 containerd[1650]: time="2025-01-13T21:07:28.793685173Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:28.794531 containerd[1650]: time="2025-01-13T21:07:28.794514442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:5,}" Jan 13 21:07:28.799681 kubelet[3044]: I0113 21:07:28.798906 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771" Jan 13 21:07:28.799766 containerd[1650]: time="2025-01-13T21:07:28.799544051Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\"" Jan 13 21:07:28.799766 containerd[1650]: time="2025-01-13T21:07:28.799671081Z" level=info msg="Ensure that sandbox 5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771 in task-service has been cleanup successfully" Jan 13 21:07:28.800071 containerd[1650]: time="2025-01-13T21:07:28.800050732Z" level=info msg="TearDown network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" successfully" Jan 13 21:07:28.800071 containerd[1650]: time="2025-01-13T21:07:28.800063100Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" returns successfully" Jan 13 21:07:28.800348 containerd[1650]: time="2025-01-13T21:07:28.800330350Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\"" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.800455181Z" level=info msg="TearDown network for sandbox 
\"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.800477099Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801200531Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801237774Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801243651Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801452581Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801514649Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801521955Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801634804Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801667499Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801673066Z" level=info msg="StopPodSandbox for 
\"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:28.803358 containerd[1650]: time="2025-01-13T21:07:28.801908707Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:5,}" Jan 13 21:07:28.803291 systemd[1]: run-netns-cni\x2dbe77e806\x2d12fb\x2db0b2\x2d080b\x2df35081407798.mount: Deactivated successfully. Jan 13 21:07:28.823932 kubelet[3044]: I0113 21:07:28.823674 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82" Jan 13 21:07:28.824682 containerd[1650]: time="2025-01-13T21:07:28.824589005Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:28.825871 containerd[1650]: time="2025-01-13T21:07:28.825850394Z" level=info msg="Ensure that sandbox 6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82 in task-service has been cleanup successfully" Jan 13 21:07:28.827827 containerd[1650]: time="2025-01-13T21:07:28.826225581Z" level=info msg="TearDown network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" successfully" Jan 13 21:07:28.827827 containerd[1650]: time="2025-01-13T21:07:28.826237121Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" returns successfully" Jan 13 21:07:28.827455 systemd[1]: run-netns-cni\x2def069873\x2d1a4a\x2d800f\x2dfd00\x2dc845fbf8edc4.mount: Deactivated successfully. 
Jan 13 21:07:28.828629 containerd[1650]: time="2025-01-13T21:07:28.827971740Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:28.828629 containerd[1650]: time="2025-01-13T21:07:28.828023896Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:28.828629 containerd[1650]: time="2025-01-13T21:07:28.828030156Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:28.829221 containerd[1650]: time="2025-01-13T21:07:28.829138939Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:28.829221 containerd[1650]: time="2025-01-13T21:07:28.829179113Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:28.829221 containerd[1650]: time="2025-01-13T21:07:28.829185308Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:28.831404 containerd[1650]: time="2025-01-13T21:07:28.831393702Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:28.831497 containerd[1650]: time="2025-01-13T21:07:28.831466051Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:28.831558 containerd[1650]: time="2025-01-13T21:07:28.831551131Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:28.842705 containerd[1650]: time="2025-01-13T21:07:28.842620723Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:28.842705 
containerd[1650]: time="2025-01-13T21:07:28.842665627Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:28.842705 containerd[1650]: time="2025-01-13T21:07:28.842672935Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:28.843415 containerd[1650]: time="2025-01-13T21:07:28.843254789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:5,}" Jan 13 21:07:28.843952 kubelet[3044]: I0113 21:07:28.843937 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1" Jan 13 21:07:28.844651 containerd[1650]: time="2025-01-13T21:07:28.844462436Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:28.844651 containerd[1650]: time="2025-01-13T21:07:28.844572293Z" level=info msg="Ensure that sandbox 099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1 in task-service has been cleanup successfully" Jan 13 21:07:28.844841 containerd[1650]: time="2025-01-13T21:07:28.844830669Z" level=info msg="TearDown network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" successfully" Jan 13 21:07:28.844881 containerd[1650]: time="2025-01-13T21:07:28.844874645Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" returns successfully" Jan 13 21:07:28.845382 containerd[1650]: time="2025-01-13T21:07:28.845359697Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:28.845522 containerd[1650]: time="2025-01-13T21:07:28.845512888Z" level=info msg="TearDown network for sandbox 
\"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:28.845522 containerd[1650]: time="2025-01-13T21:07:28.845545167Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:28.846421 containerd[1650]: time="2025-01-13T21:07:28.845760709Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846525837Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846544869Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846670635Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846707834Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846713571Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846837302Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846869641Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:28.847794 containerd[1650]: time="2025-01-13T21:07:28.846891044Z" level=info msg="StopPodSandbox for 
\"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:28.848243 containerd[1650]: time="2025-01-13T21:07:28.848135064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:5,}" Jan 13 21:07:28.848776 systemd[1]: run-netns-cni\x2d9cbc0391\x2df934\x2dfb5c\x2d0dc4\x2d54c43851a7b2.mount: Deactivated successfully. Jan 13 21:07:28.889070 kubelet[3044]: I0113 21:07:28.888976 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857" Jan 13 21:07:28.890245 containerd[1650]: time="2025-01-13T21:07:28.889751174Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\"" Jan 13 21:07:28.896778 kubelet[3044]: I0113 21:07:28.896762 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37" Jan 13 21:07:28.897524 containerd[1650]: time="2025-01-13T21:07:28.897503033Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:28.899423 containerd[1650]: time="2025-01-13T21:07:28.899407214Z" level=info msg="Ensure that sandbox 0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37 in task-service has been cleanup successfully" Jan 13 21:07:28.901071 containerd[1650]: time="2025-01-13T21:07:28.901056221Z" level=info msg="TearDown network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" successfully" Jan 13 21:07:28.901071 containerd[1650]: time="2025-01-13T21:07:28.901068479Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" returns successfully" Jan 13 21:07:28.903034 containerd[1650]: time="2025-01-13T21:07:28.901977621Z" level=info 
msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:28.903034 containerd[1650]: time="2025-01-13T21:07:28.902024863Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:28.903034 containerd[1650]: time="2025-01-13T21:07:28.902031256Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:28.903513 containerd[1650]: time="2025-01-13T21:07:28.902869368Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:28.903513 containerd[1650]: time="2025-01-13T21:07:28.903164136Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:28.903513 containerd[1650]: time="2025-01-13T21:07:28.903255255Z" level=info msg="Ensure that sandbox 7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857 in task-service has been cleanup successfully" Jan 13 21:07:28.903513 containerd[1650]: time="2025-01-13T21:07:28.903186109Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.903687834Z" level=info msg="TearDown network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.903696674Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.904578552Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:28.918443 containerd[1650]: 
time="2025-01-13T21:07:28.904631994Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.904639898Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.904871213Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\"" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.904907843Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.905232274Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.905651928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:4,}" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.906407334Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.906445271Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.906451333Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.908074501Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 
21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.908117735Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.908174271Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.909224274Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.909261845Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.909274571Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:28.918443 containerd[1650]: time="2025-01-13T21:07:28.911535556Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:5,}" Jan 13 21:07:28.989133 containerd[1650]: time="2025-01-13T21:07:28.989099820Z" level=error msg="Failed to destroy network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.989708 containerd[1650]: time="2025-01-13T21:07:28.989688038Z" level=error msg="encountered an error cleaning up failed sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.990057 containerd[1650]: time="2025-01-13T21:07:28.990025058Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.991044 kubelet[3044]: E0113 21:07:28.990480 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.991044 kubelet[3044]: E0113 21:07:28.990512 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:28.991044 kubelet[3044]: E0113 21:07:28.990525 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:28.991148 kubelet[3044]: E0113 21:07:28.990558 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:28.993112 containerd[1650]: time="2025-01-13T21:07:28.993063309Z" level=error msg="Failed to destroy network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.993332 containerd[1650]: time="2025-01-13T21:07:28.993242725Z" level=error msg="encountered an error cleaning up failed sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.993332 containerd[1650]: time="2025-01-13T21:07:28.993275570Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.993332 containerd[1650]: time="2025-01-13T21:07:28.993308940Z" level=error msg="Failed to destroy network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.993562 kubelet[3044]: E0113 21:07:28.993370 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.993562 kubelet[3044]: E0113 21:07:28.993407 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:28.993562 kubelet[3044]: E0113 21:07:28.993420 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:28.993654 kubelet[3044]: E0113 21:07:28.993471 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:28.994293 containerd[1650]: time="2025-01-13T21:07:28.993887762Z" level=error msg="encountered an error cleaning up failed sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.994293 containerd[1650]: time="2025-01-13T21:07:28.994146900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.994765 kubelet[3044]: E0113 21:07:28.994754 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:28.994818 kubelet[3044]: E0113 21:07:28.994781 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:28.994818 kubelet[3044]: E0113 21:07:28.994793 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:28.994873 kubelet[3044]: E0113 21:07:28.994838 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code 
= Unknown desc = failed to setup network for sandbox \\\"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 21:07:29.012057 containerd[1650]: time="2025-01-13T21:07:29.012019610Z" level=error msg="Failed to destroy network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.012220 containerd[1650]: time="2025-01-13T21:07:29.012198918Z" level=error msg="encountered an error cleaning up failed sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.012252 containerd[1650]: time="2025-01-13T21:07:29.012231616Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:5,} failed, error" error="failed to setup network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.012593 kubelet[3044]: E0113 21:07:29.012457 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.013041 kubelet[3044]: E0113 21:07:29.012643 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:29.013041 kubelet[3044]: E0113 21:07:29.012662 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:29.013041 kubelet[3044]: E0113 21:07:29.012695 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:29.138539 containerd[1650]: time="2025-01-13T21:07:29.138331620Z" level=error msg="Failed to destroy network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.138622 containerd[1650]: time="2025-01-13T21:07:29.138535461Z" level=error msg="encountered an error cleaning up failed sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.138622 containerd[1650]: time="2025-01-13T21:07:29.138567329Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:4,} failed, error" error="failed to setup network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.139820 kubelet[3044]: E0113 21:07:29.138831 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.139820 kubelet[3044]: E0113 
21:07:29.138881 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:29.139820 kubelet[3044]: E0113 21:07:29.138900 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:29.139920 kubelet[3044]: E0113 21:07:29.139035 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:29.144301 containerd[1650]: time="2025-01-13T21:07:29.144271599Z" level=error msg="Failed to destroy network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.144749 containerd[1650]: time="2025-01-13T21:07:29.144731273Z" level=error msg="encountered an error cleaning up failed sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.145206 containerd[1650]: time="2025-01-13T21:07:29.144925846Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.145758 kubelet[3044]: E0113 21:07:29.145042 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:29.145758 kubelet[3044]: E0113 21:07:29.145070 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:29.145758 kubelet[3044]: E0113 21:07:29.145082 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:29.145878 kubelet[3044]: E0113 21:07:29.145109 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:29.416660 systemd[1]: run-netns-cni\x2d92e50b12\x2d0112\x2d6740\x2d34a0\x2d122ca11ae341.mount: Deactivated successfully. Jan 13 21:07:29.416742 systemd[1]: run-netns-cni\x2dd7287adc\x2d0154\x2df4d3\x2dadc8\x2df30baca25215.mount: Deactivated successfully. Jan 13 21:07:29.746701 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3991061319.mount: Deactivated successfully. 
Jan 13 21:07:29.865796 containerd[1650]: time="2025-01-13T21:07:29.860000967Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.1: active requests=0, bytes read=142742010" Jan 13 21:07:29.892229 containerd[1650]: time="2025-01-13T21:07:29.891487099Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:29.921948 containerd[1650]: time="2025-01-13T21:07:29.921867882Z" level=info msg="ImageCreate event name:\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:29.923324 containerd[1650]: time="2025-01-13T21:07:29.922722451Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:29.929424 kubelet[3044]: I0113 21:07:29.928968 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25" Jan 13 21:07:29.930311 kubelet[3044]: I0113 21:07:29.930112 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee" Jan 13 21:07:29.932618 containerd[1650]: time="2025-01-13T21:07:29.932547630Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.1\" with image id \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1\", size \"142741872\" in 6.375354391s" Jan 13 21:07:29.932618 containerd[1650]: time="2025-01-13T21:07:29.932569169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference 
\"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Jan 13 21:07:29.939102 containerd[1650]: time="2025-01-13T21:07:29.938960207Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" Jan 13 21:07:29.939102 containerd[1650]: time="2025-01-13T21:07:29.939074594Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" Jan 13 21:07:29.939427 containerd[1650]: time="2025-01-13T21:07:29.939295608Z" level=info msg="Ensure that sandbox f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25 in task-service has been cleanup successfully" Jan 13 21:07:29.940984 containerd[1650]: time="2025-01-13T21:07:29.940462664Z" level=info msg="TearDown network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" successfully" Jan 13 21:07:29.940984 containerd[1650]: time="2025-01-13T21:07:29.940475269Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" returns successfully" Jan 13 21:07:29.943778 containerd[1650]: time="2025-01-13T21:07:29.942257268Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:29.943778 containerd[1650]: time="2025-01-13T21:07:29.942311129Z" level=info msg="TearDown network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" successfully" Jan 13 21:07:29.943778 containerd[1650]: time="2025-01-13T21:07:29.942317664Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" returns successfully" Jan 13 21:07:29.943778 containerd[1650]: time="2025-01-13T21:07:29.943023272Z" level=info msg="Ensure that sandbox 2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee in task-service has been cleanup successfully" Jan 13 21:07:29.942927 systemd[1]: 
run-netns-cni\x2d669b2d98\x2d8f19\x2da01b\x2d5ba8\x2d7e49bec80ebc.mount: Deactivated successfully. Jan 13 21:07:29.946456 systemd[1]: run-netns-cni\x2dc9d9def2\x2dfd33\x2daf1c\x2d186b\x2d5959ba7344f9.mount: Deactivated successfully. Jan 13 21:07:29.946965 containerd[1650]: time="2025-01-13T21:07:29.944903005Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:29.946965 containerd[1650]: time="2025-01-13T21:07:29.944956128Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:29.946965 containerd[1650]: time="2025-01-13T21:07:29.944963562Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:29.947124 containerd[1650]: time="2025-01-13T21:07:29.947110676Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" Jan 13 21:07:29.947210 containerd[1650]: time="2025-01-13T21:07:29.947201521Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully" Jan 13 21:07:29.947264 containerd[1650]: time="2025-01-13T21:07:29.947255634Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:29.947669 containerd[1650]: time="2025-01-13T21:07:29.947631764Z" level=info msg="TearDown network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" successfully" Jan 13 21:07:29.947669 containerd[1650]: time="2025-01-13T21:07:29.947642158Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" returns successfully" Jan 13 21:07:29.948662 containerd[1650]: time="2025-01-13T21:07:29.948557522Z" level=info msg="StopPodSandbox for 
\"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:29.948713 containerd[1650]: time="2025-01-13T21:07:29.948702403Z" level=info msg="TearDown network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" successfully" Jan 13 21:07:29.948713 containerd[1650]: time="2025-01-13T21:07:29.948710067Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" returns successfully" Jan 13 21:07:29.948756 containerd[1650]: time="2025-01-13T21:07:29.948609925Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:29.948795 containerd[1650]: time="2025-01-13T21:07:29.948781841Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:29.948795 containerd[1650]: time="2025-01-13T21:07:29.948790993Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:29.953560 containerd[1650]: time="2025-01-13T21:07:29.953471339Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:29.953560 containerd[1650]: time="2025-01-13T21:07:29.953510216Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:29.953560 containerd[1650]: time="2025-01-13T21:07:29.953548202Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:29.953560 containerd[1650]: time="2025-01-13T21:07:29.953554684Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:29.953661 containerd[1650]: time="2025-01-13T21:07:29.953515690Z" level=info msg="TearDown network for sandbox 
\"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:29.953661 containerd[1650]: time="2025-01-13T21:07:29.953581055Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:29.954051 containerd[1650]: time="2025-01-13T21:07:29.954035698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:6,}" Jan 13 21:07:29.954100 containerd[1650]: time="2025-01-13T21:07:29.954090106Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:29.954455 containerd[1650]: time="2025-01-13T21:07:29.954156056Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:29.954455 containerd[1650]: time="2025-01-13T21:07:29.954170790Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:29.954501 kubelet[3044]: I0113 21:07:29.954431 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be" Jan 13 21:07:29.954719 containerd[1650]: time="2025-01-13T21:07:29.954708062Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:29.954842 containerd[1650]: time="2025-01-13T21:07:29.954792691Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:29.954842 containerd[1650]: time="2025-01-13T21:07:29.954816430Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:29.955550 containerd[1650]: 
time="2025-01-13T21:07:29.955471017Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\"" Jan 13 21:07:29.959475 containerd[1650]: time="2025-01-13T21:07:29.957066444Z" level=info msg="Ensure that sandbox 1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be in task-service has been cleanup successfully" Jan 13 21:07:29.959475 containerd[1650]: time="2025-01-13T21:07:29.957298936Z" level=info msg="TearDown network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" successfully" Jan 13 21:07:29.959475 containerd[1650]: time="2025-01-13T21:07:29.957307740Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" returns successfully" Jan 13 21:07:29.959064 systemd[1]: run-netns-cni\x2db8085f61\x2d7333\x2da6ec\x2d8e4f\x2dbde19367ab2d.mount: Deactivated successfully. Jan 13 21:07:29.961744 containerd[1650]: time="2025-01-13T21:07:29.961726689Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:29.961871 containerd[1650]: time="2025-01-13T21:07:29.961779492Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:29.961871 containerd[1650]: time="2025-01-13T21:07:29.961830222Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:29.962181 containerd[1650]: time="2025-01-13T21:07:29.962065444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:6,}" Jan 13 21:07:29.962181 containerd[1650]: time="2025-01-13T21:07:29.962132481Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\"" Jan 13 21:07:29.962280 containerd[1650]: 
time="2025-01-13T21:07:29.962271025Z" level=info msg="TearDown network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" successfully" Jan 13 21:07:29.962314 containerd[1650]: time="2025-01-13T21:07:29.962307350Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" returns successfully" Jan 13 21:07:29.962604 containerd[1650]: time="2025-01-13T21:07:29.962541957Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\"" Jan 13 21:07:29.962637 containerd[1650]: time="2025-01-13T21:07:29.962582253Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully" Jan 13 21:07:29.962658 containerd[1650]: time="2025-01-13T21:07:29.962636443Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully" Jan 13 21:07:29.962822 containerd[1650]: time="2025-01-13T21:07:29.962761972Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" Jan 13 21:07:29.962851 containerd[1650]: time="2025-01-13T21:07:29.962835470Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully" Jan 13 21:07:29.962851 containerd[1650]: time="2025-01-13T21:07:29.962842530Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully" Jan 13 21:07:29.962981 containerd[1650]: time="2025-01-13T21:07:29.962940440Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 21:07:29.963058 containerd[1650]: time="2025-01-13T21:07:29.963025361Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:29.963058 containerd[1650]: 
time="2025-01-13T21:07:29.963033966Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:29.963290 containerd[1650]: time="2025-01-13T21:07:29.963230130Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:29.963290 containerd[1650]: time="2025-01-13T21:07:29.963267261Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:29.963290 containerd[1650]: time="2025-01-13T21:07:29.963273105Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:29.963438 kubelet[3044]: I0113 21:07:29.963423 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1" Jan 13 21:07:29.964424 containerd[1650]: time="2025-01-13T21:07:29.963967742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:6,}" Jan 13 21:07:29.965333 containerd[1650]: time="2025-01-13T21:07:29.964025001Z" level=info msg="StopPodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\"" Jan 13 21:07:29.965596 containerd[1650]: time="2025-01-13T21:07:29.965582570Z" level=info msg="Ensure that sandbox 23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1 in task-service has been cleanup successfully" Jan 13 21:07:29.966366 containerd[1650]: time="2025-01-13T21:07:29.966148679Z" level=info msg="TearDown network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" successfully" Jan 13 21:07:29.966366 containerd[1650]: time="2025-01-13T21:07:29.966160585Z" level=info msg="StopPodSandbox for 
\"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" returns successfully" Jan 13 21:07:29.966366 containerd[1650]: time="2025-01-13T21:07:29.966294144Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:29.966366 containerd[1650]: time="2025-01-13T21:07:29.966354436Z" level=info msg="TearDown network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" successfully" Jan 13 21:07:29.966366 containerd[1650]: time="2025-01-13T21:07:29.966361561Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" returns successfully" Jan 13 21:07:29.966827 containerd[1650]: time="2025-01-13T21:07:29.966812149Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:29.966872 containerd[1650]: time="2025-01-13T21:07:29.966855199Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:29.966872 containerd[1650]: time="2025-01-13T21:07:29.966864818Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:29.967017 kubelet[3044]: I0113 21:07:29.967005 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733" Jan 13 21:07:29.967228 containerd[1650]: time="2025-01-13T21:07:29.967147511Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:29.967228 containerd[1650]: time="2025-01-13T21:07:29.967188494Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:29.967228 containerd[1650]: time="2025-01-13T21:07:29.967195086Z" level=info msg="StopPodSandbox 
for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:29.967424 containerd[1650]: time="2025-01-13T21:07:29.967392100Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:29.967527 containerd[1650]: time="2025-01-13T21:07:29.967467726Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:29.967527 containerd[1650]: time="2025-01-13T21:07:29.967475570Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:29.967771 containerd[1650]: time="2025-01-13T21:07:29.967761813Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:5,}" Jan 13 21:07:29.968522 kubelet[3044]: I0113 21:07:29.968509 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.976910721Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977017761Z" level=info msg="Ensure that sandbox f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1 in task-service has been cleanup successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977162096Z" level=info msg="TearDown network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977170402Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" returns successfully" Jan 13 21:07:29.978210 
containerd[1650]: time="2025-01-13T21:07:29.977383361Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977420544Z" level=info msg="TearDown network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977426114Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" returns successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977451885Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977525950Z" level=info msg="Ensure that sandbox a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733 in task-service has been cleanup successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977621821Z" level=info msg="TearDown network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977628982Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" returns successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977773725Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977821343Z" level=info msg="TearDown network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977827438Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully" Jan 13 
21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977853290Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977896904Z" level=info msg="TearDown network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.977904041Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" returns successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978039521Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978071408Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978076601Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978122666Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978155270Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:29.978210 containerd[1650]: time="2025-01-13T21:07:29.978160663Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978282780Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:29.978646 
containerd[1650]: time="2025-01-13T21:07:29.978314391Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978320864Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978343820Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978373311Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978378797Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978501748Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978534578Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:29.978646 containerd[1650]: time="2025-01-13T21:07:29.978540777Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:29.978828 containerd[1650]: time="2025-01-13T21:07:29.978680146Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:29.978828 containerd[1650]: time="2025-01-13T21:07:29.978712511Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:29.978828 
containerd[1650]: time="2025-01-13T21:07:29.978717876Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:29.978828 containerd[1650]: time="2025-01-13T21:07:29.978743600Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:29.978828 containerd[1650]: time="2025-01-13T21:07:29.978771198Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:29.978828 containerd[1650]: time="2025-01-13T21:07:29.978794063Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:29.979788 containerd[1650]: time="2025-01-13T21:07:29.979056624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:6,}" Jan 13 21:07:29.979788 containerd[1650]: time="2025-01-13T21:07:29.979265452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:6,}" Jan 13 21:07:29.989180 systemd-journald[1197]: Under memory pressure, flushing caches. Jan 13 21:07:29.986987 systemd-resolved[1547]: Under memory pressure, flushing caches. Jan 13 21:07:29.987029 systemd-resolved[1547]: Flushed all caches. 
Jan 13 21:07:30.055756 containerd[1650]: time="2025-01-13T21:07:30.055275948Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 13 21:07:30.124196 containerd[1650]: time="2025-01-13T21:07:30.123732842Z" level=error msg="Failed to destroy network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.125694 containerd[1650]: time="2025-01-13T21:07:30.125665194Z" level=error msg="encountered an error cleaning up failed sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.126204 containerd[1650]: time="2025-01-13T21:07:30.126184159Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.126652 kubelet[3044]: E0113 21:07:30.126313 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.126652 kubelet[3044]: E0113 21:07:30.126352 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:30.126652 kubelet[3044]: E0113 21:07:30.126370 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" Jan 13 21:07:30.126721 kubelet[3044]: E0113 21:07:30.126407 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76db674964-8lv5h_calico-system(78856142-3d1a-4f19-a50a-e5f8505c2330)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podUID="78856142-3d1a-4f19-a50a-e5f8505c2330" Jan 13 
21:07:30.133654 containerd[1650]: time="2025-01-13T21:07:30.133622281Z" level=error msg="Failed to destroy network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.139155 containerd[1650]: time="2025-01-13T21:07:30.134192688Z" level=error msg="encountered an error cleaning up failed sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.139155 containerd[1650]: time="2025-01-13T21:07:30.134462749Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.139264 kubelet[3044]: E0113 21:07:30.134622 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.139264 kubelet[3044]: E0113 21:07:30.134652 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:30.139264 kubelet[3044]: E0113 21:07:30.134667 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bhkcp" Jan 13 21:07:30.141092 kubelet[3044]: E0113 21:07:30.134701 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bhkcp_kube-system(000eb469-bf65-41e7-a410-d2b847f032c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bhkcp" podUID="000eb469-bf65-41e7-a410-d2b847f032c6" Jan 13 21:07:30.158334 containerd[1650]: time="2025-01-13T21:07:30.158308399Z" level=error msg="Failed to destroy network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 
13 21:07:30.159717 containerd[1650]: time="2025-01-13T21:07:30.159702729Z" level=error msg="Failed to destroy network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.159967 containerd[1650]: time="2025-01-13T21:07:30.159953137Z" level=error msg="encountered an error cleaning up failed sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160243 containerd[1650]: time="2025-01-13T21:07:30.160224673Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:5,} failed, error" error="failed to setup network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160321 containerd[1650]: time="2025-01-13T21:07:30.160079886Z" level=error msg="encountered an error cleaning up failed sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160427 containerd[1650]: time="2025-01-13T21:07:30.160368645Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:6,} failed, error" error="failed to setup network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160427 containerd[1650]: time="2025-01-13T21:07:30.160105686Z" level=error msg="Failed to destroy network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160591 containerd[1650]: time="2025-01-13T21:07:30.160579240Z" level=error msg="encountered an error cleaning up failed sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160638 containerd[1650]: time="2025-01-13T21:07:30.160628379Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160734 containerd[1650]: time="2025-01-13T21:07:30.160127167Z" level=error msg="Failed to destroy network for sandbox 
\"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160939 containerd[1650]: time="2025-01-13T21:07:30.160879390Z" level=error msg="encountered an error cleaning up failed sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.160939 containerd[1650]: time="2025-01-13T21:07:30.160900708Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:6,} failed, error" error="failed to setup network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.161143 kubelet[3044]: E0113 21:07:30.161129 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.161772 kubelet[3044]: E0113 21:07:30.161264 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:30.161772 kubelet[3044]: E0113 21:07:30.161280 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" Jan 13 21:07:30.161772 kubelet[3044]: E0113 21:07:30.161313 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-9bjj8_calico-apiserver(27d57ea0-66c3-451f-b6eb-d7050a90a389)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podUID="27d57ea0-66c3-451f-b6eb-d7050a90a389" Jan 13 21:07:30.162071 kubelet[3044]: E0113 21:07:30.161204 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jan 13 21:07:30.162071 kubelet[3044]: E0113 21:07:30.161928 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:30.162071 kubelet[3044]: E0113 21:07:30.161945 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-x4zz7" Jan 13 21:07:30.162133 kubelet[3044]: E0113 21:07:30.161971 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-x4zz7_calico-system(59df9a6a-247e-477f-9e6c-e3512a44f477)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-x4zz7" podUID="59df9a6a-247e-477f-9e6c-e3512a44f477" Jan 13 21:07:30.162133 kubelet[3044]: E0113 21:07:30.161217 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.162133 kubelet[3044]: E0113 21:07:30.161989 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:30.162206 kubelet[3044]: E0113 21:07:30.162000 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-76f75df574-bbgth" Jan 13 21:07:30.162206 kubelet[3044]: E0113 21:07:30.162018 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-76f75df574-bbgth_kube-system(843fc70f-b236-4057-8b13-ac6f976b6914)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-76f75df574-bbgth" 
podUID="843fc70f-b236-4057-8b13-ac6f976b6914" Jan 13 21:07:30.162206 kubelet[3044]: E0113 21:07:30.161230 3044 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 13 21:07:30.162275 kubelet[3044]: E0113 21:07:30.162032 3044 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:30.162275 kubelet[3044]: E0113 21:07:30.162042 3044 kuberuntime_manager.go:1172] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" Jan 13 21:07:30.162275 kubelet[3044]: E0113 21:07:30.162058 3044 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7b989c8495-wsw2f_calico-apiserver(ddbbefea-15ee-4654-983a-5f453cee91ee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podUID="ddbbefea-15ee-4654-983a-5f453cee91ee" Jan 13 21:07:30.195400 containerd[1650]: time="2025-01-13T21:07:30.195367252Z" level=info msg="CreateContainer within sandbox \"c16a2b495884583b49887e9a3180e791003d36fc751f451766c61d91fdb3041b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4846e8747f1d48febe1f48fba68de265a714ffe9d08756d4a1b0071d7c82cee8\"" Jan 13 21:07:30.198955 containerd[1650]: time="2025-01-13T21:07:30.198934053Z" level=info msg="StartContainer for \"4846e8747f1d48febe1f48fba68de265a714ffe9d08756d4a1b0071d7c82cee8\"" Jan 13 21:07:30.309434 containerd[1650]: time="2025-01-13T21:07:30.308418169Z" level=info msg="StartContainer for \"4846e8747f1d48febe1f48fba68de265a714ffe9d08756d4a1b0071d7c82cee8\" returns successfully" Jan 13 21:07:30.415647 systemd[1]: run-netns-cni\x2d6dcc4e54\x2d04de\x2d738e\x2d3321\x2d9249cf4479b1.mount: Deactivated successfully. Jan 13 21:07:30.415843 systemd[1]: run-netns-cni\x2d6f8da294\x2db5db\x2dd912\x2d2679\x2ddb3a65923931.mount: Deactivated successfully. Jan 13 21:07:30.415960 systemd[1]: run-netns-cni\x2df17ee3a5\x2d1037\x2ddb5e\x2dfe21\x2de7debd12dc32.mount: Deactivated successfully. Jan 13 21:07:30.850199 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 13 21:07:30.850361 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 13 21:07:30.977971 kubelet[3044]: I0113 21:07:30.977689 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148" Jan 13 21:07:30.979454 containerd[1650]: time="2025-01-13T21:07:30.978333382Z" level=info msg="StopPodSandbox for \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\"" Jan 13 21:07:30.979454 containerd[1650]: time="2025-01-13T21:07:30.978442789Z" level=info msg="Ensure that sandbox 8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148 in task-service has been cleanup successfully" Jan 13 21:07:30.979454 containerd[1650]: time="2025-01-13T21:07:30.978964647Z" level=info msg="TearDown network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" successfully" Jan 13 21:07:30.979454 containerd[1650]: time="2025-01-13T21:07:30.978985449Z" level=info msg="StopPodSandbox for \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" returns successfully" Jan 13 21:07:30.979454 containerd[1650]: time="2025-01-13T21:07:30.979162211Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" Jan 13 21:07:30.981194 systemd[1]: run-netns-cni\x2d10ed6cf0\x2d16b1\x2d182e\x2d413a\x2d1d5fb79449b3.mount: Deactivated successfully. 
Jan 13 21:07:30.982902 containerd[1650]: time="2025-01-13T21:07:30.982520308Z" level=info msg="TearDown network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" successfully" Jan 13 21:07:30.982902 containerd[1650]: time="2025-01-13T21:07:30.982545739Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" returns successfully" Jan 13 21:07:30.984177 containerd[1650]: time="2025-01-13T21:07:30.984150170Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:30.984216 containerd[1650]: time="2025-01-13T21:07:30.984206084Z" level=info msg="TearDown network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" successfully" Jan 13 21:07:30.984248 containerd[1650]: time="2025-01-13T21:07:30.984215609Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" returns successfully" Jan 13 21:07:30.984552 containerd[1650]: time="2025-01-13T21:07:30.984475548Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:30.984628 containerd[1650]: time="2025-01-13T21:07:30.984589168Z" level=info msg="TearDown network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:30.984628 containerd[1650]: time="2025-01-13T21:07:30.984599628Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:30.985511 containerd[1650]: time="2025-01-13T21:07:30.985224822Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:30.985630 containerd[1650]: time="2025-01-13T21:07:30.985596118Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 
13 21:07:30.985630 containerd[1650]: time="2025-01-13T21:07:30.985606566Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:30.986106 containerd[1650]: time="2025-01-13T21:07:30.986065981Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:30.986312 containerd[1650]: time="2025-01-13T21:07:30.986298847Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:30.986447 containerd[1650]: time="2025-01-13T21:07:30.986309439Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:30.987420 containerd[1650]: time="2025-01-13T21:07:30.987403513Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:30.987467 containerd[1650]: time="2025-01-13T21:07:30.987454077Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:30.987467 containerd[1650]: time="2025-01-13T21:07:30.987464108Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:30.987780 containerd[1650]: time="2025-01-13T21:07:30.987760786Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:7,}" Jan 13 21:07:30.991616 kubelet[3044]: I0113 21:07:30.991278 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb" Jan 13 21:07:30.992372 containerd[1650]: time="2025-01-13T21:07:30.992201003Z" level=info msg="StopPodSandbox for 
\"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\"" Jan 13 21:07:30.992372 containerd[1650]: time="2025-01-13T21:07:30.992332842Z" level=info msg="Ensure that sandbox 019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb in task-service has been cleanup successfully" Jan 13 21:07:30.994254 systemd[1]: run-netns-cni\x2d64eb0110\x2da968\x2d5e21\x2dfe9d\x2d805b5af279d1.mount: Deactivated successfully. Jan 13 21:07:30.995349 containerd[1650]: time="2025-01-13T21:07:30.995110778Z" level=info msg="TearDown network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" successfully" Jan 13 21:07:30.995349 containerd[1650]: time="2025-01-13T21:07:30.995123434Z" level=info msg="StopPodSandbox for \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" returns successfully" Jan 13 21:07:30.995900 containerd[1650]: time="2025-01-13T21:07:30.995788448Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" Jan 13 21:07:30.996040 containerd[1650]: time="2025-01-13T21:07:30.995944251Z" level=info msg="TearDown network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" successfully" Jan 13 21:07:30.996040 containerd[1650]: time="2025-01-13T21:07:30.995954143Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" returns successfully" Jan 13 21:07:30.996675 containerd[1650]: time="2025-01-13T21:07:30.996604590Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:30.996675 containerd[1650]: time="2025-01-13T21:07:30.996652335Z" level=info msg="TearDown network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" successfully" Jan 13 21:07:30.996675 containerd[1650]: time="2025-01-13T21:07:30.996658838Z" level=info msg="StopPodSandbox for 
\"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" returns successfully" Jan 13 21:07:30.997331 containerd[1650]: time="2025-01-13T21:07:30.997228690Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:30.997331 containerd[1650]: time="2025-01-13T21:07:30.997278878Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:30.997331 containerd[1650]: time="2025-01-13T21:07:30.997286909Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:30.999302 containerd[1650]: time="2025-01-13T21:07:30.999187670Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:30.999302 containerd[1650]: time="2025-01-13T21:07:30.999250120Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:30.999302 containerd[1650]: time="2025-01-13T21:07:30.999259251Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:31.000287 containerd[1650]: time="2025-01-13T21:07:31.000184509Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:31.000407 containerd[1650]: time="2025-01-13T21:07:31.000360231Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:31.000407 containerd[1650]: time="2025-01-13T21:07:31.000370222Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:31.001043 containerd[1650]: time="2025-01-13T21:07:31.000941532Z" level=info msg="StopPodSandbox for 
\"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:31.001129 containerd[1650]: time="2025-01-13T21:07:31.001002114Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:31.001129 containerd[1650]: time="2025-01-13T21:07:31.001085853Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:31.001949 containerd[1650]: time="2025-01-13T21:07:31.001902051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:7,}" Jan 13 21:07:31.003817 kubelet[3044]: I0113 21:07:31.003069 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6" Jan 13 21:07:31.007691 containerd[1650]: time="2025-01-13T21:07:31.007664727Z" level=info msg="StopPodSandbox for \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\"" Jan 13 21:07:31.009782 containerd[1650]: time="2025-01-13T21:07:31.007817379Z" level=info msg="Ensure that sandbox f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6 in task-service has been cleanup successfully" Jan 13 21:07:31.009782 containerd[1650]: time="2025-01-13T21:07:31.007924051Z" level=info msg="TearDown network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" successfully" Jan 13 21:07:31.009782 containerd[1650]: time="2025-01-13T21:07:31.007932007Z" level=info msg="StopPodSandbox for \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" returns successfully" Jan 13 21:07:31.014545 systemd[1]: run-netns-cni\x2d0bf49732\x2d7f3d\x2dc19c\x2db91a\x2d929dc67e0d11.mount: Deactivated successfully. 
Jan 13 21:07:31.019782 containerd[1650]: time="2025-01-13T21:07:31.019694141Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\"" Jan 13 21:07:31.020363 containerd[1650]: time="2025-01-13T21:07:31.020218830Z" level=info msg="TearDown network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" successfully" Jan 13 21:07:31.020363 containerd[1650]: time="2025-01-13T21:07:31.020231015Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" returns successfully" Jan 13 21:07:31.029443 containerd[1650]: time="2025-01-13T21:07:31.029016522Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\"" Jan 13 21:07:31.029809 containerd[1650]: time="2025-01-13T21:07:31.029765305Z" level=info msg="TearDown network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" successfully" Jan 13 21:07:31.029905 containerd[1650]: time="2025-01-13T21:07:31.029799587Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" returns successfully" Jan 13 21:07:31.030269 containerd[1650]: time="2025-01-13T21:07:31.030254961Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\"" Jan 13 21:07:31.030423 containerd[1650]: time="2025-01-13T21:07:31.030347982Z" level=info msg="TearDown network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully" Jan 13 21:07:31.030423 containerd[1650]: time="2025-01-13T21:07:31.030356857Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully" Jan 13 21:07:31.030699 containerd[1650]: time="2025-01-13T21:07:31.030626720Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\"" Jan 13 21:07:31.031364 
containerd[1650]: time="2025-01-13T21:07:31.031251261Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully" Jan 13 21:07:31.031364 containerd[1650]: time="2025-01-13T21:07:31.031262645Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully" Jan 13 21:07:31.034004 containerd[1650]: time="2025-01-13T21:07:31.033480523Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\"" Jan 13 21:07:31.034214 containerd[1650]: time="2025-01-13T21:07:31.034184624Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully" Jan 13 21:07:31.034214 containerd[1650]: time="2025-01-13T21:07:31.034195029Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully" Jan 13 21:07:31.035735 containerd[1650]: time="2025-01-13T21:07:31.035622389Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\"" Jan 13 21:07:31.036208 containerd[1650]: time="2025-01-13T21:07:31.036131589Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully" Jan 13 21:07:31.036208 containerd[1650]: time="2025-01-13T21:07:31.036142053Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully" Jan 13 21:07:31.037342 containerd[1650]: time="2025-01-13T21:07:31.037330213Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:7,}" Jan 13 21:07:31.038036 kubelet[3044]: I0113 21:07:31.038006 3044 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df" Jan 13 21:07:31.043550 containerd[1650]: time="2025-01-13T21:07:31.043020641Z" level=info msg="StopPodSandbox for \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\"" Jan 13 21:07:31.043550 containerd[1650]: time="2025-01-13T21:07:31.043155032Z" level=info msg="Ensure that sandbox 02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df in task-service has been cleanup successfully" Jan 13 21:07:31.043550 containerd[1650]: time="2025-01-13T21:07:31.043310651Z" level=info msg="TearDown network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" successfully" Jan 13 21:07:31.043550 containerd[1650]: time="2025-01-13T21:07:31.043319087Z" level=info msg="StopPodSandbox for \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" returns successfully" Jan 13 21:07:31.044218 containerd[1650]: time="2025-01-13T21:07:31.043980862Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\"" Jan 13 21:07:31.044218 containerd[1650]: time="2025-01-13T21:07:31.044020962Z" level=info msg="TearDown network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" successfully" Jan 13 21:07:31.044218 containerd[1650]: time="2025-01-13T21:07:31.044027656Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" returns successfully" Jan 13 21:07:31.044431 containerd[1650]: time="2025-01-13T21:07:31.044389966Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\"" Jan 13 21:07:31.044457 containerd[1650]: time="2025-01-13T21:07:31.044431528Z" level=info msg="TearDown network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" successfully" Jan 13 21:07:31.044457 containerd[1650]: time="2025-01-13T21:07:31.044437757Z" level=info msg="StopPodSandbox 
for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" returns successfully" Jan 13 21:07:31.045129 containerd[1650]: time="2025-01-13T21:07:31.045014318Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\"" Jan 13 21:07:31.045129 containerd[1650]: time="2025-01-13T21:07:31.045061441Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully" Jan 13 21:07:31.045129 containerd[1650]: time="2025-01-13T21:07:31.045067982Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully" Jan 13 21:07:31.045514 containerd[1650]: time="2025-01-13T21:07:31.045503452Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\"" Jan 13 21:07:31.045600 containerd[1650]: time="2025-01-13T21:07:31.045592056Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully" Jan 13 21:07:31.045697 containerd[1650]: time="2025-01-13T21:07:31.045689537Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully" Jan 13 21:07:31.046685 containerd[1650]: time="2025-01-13T21:07:31.046668596Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\"" Jan 13 21:07:31.047210 containerd[1650]: time="2025-01-13T21:07:31.047185618Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully" Jan 13 21:07:31.047293 containerd[1650]: time="2025-01-13T21:07:31.047252558Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully" Jan 13 21:07:31.048188 containerd[1650]: time="2025-01-13T21:07:31.048087531Z" level=info msg="StopPodSandbox for 
\"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\"" Jan 13 21:07:31.048905 containerd[1650]: time="2025-01-13T21:07:31.048882426Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully" Jan 13 21:07:31.048905 containerd[1650]: time="2025-01-13T21:07:31.048893976Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully" Jan 13 21:07:31.050311 containerd[1650]: time="2025-01-13T21:07:31.049964369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:7,}" Jan 13 21:07:31.051690 kubelet[3044]: I0113 21:07:31.051630 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc" Jan 13 21:07:31.053968 containerd[1650]: time="2025-01-13T21:07:31.053863248Z" level=info msg="StopPodSandbox for \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\"" Jan 13 21:07:31.054224 containerd[1650]: time="2025-01-13T21:07:31.054042024Z" level=info msg="Ensure that sandbox 1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc in task-service has been cleanup successfully" Jan 13 21:07:31.054957 containerd[1650]: time="2025-01-13T21:07:31.054937178Z" level=info msg="TearDown network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" successfully" Jan 13 21:07:31.054957 containerd[1650]: time="2025-01-13T21:07:31.054948538Z" level=info msg="StopPodSandbox for \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" returns successfully" Jan 13 21:07:31.056827 containerd[1650]: time="2025-01-13T21:07:31.056581919Z" level=info msg="StopPodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\"" Jan 13 21:07:31.056827 containerd[1650]: 
time="2025-01-13T21:07:31.056631447Z" level=info msg="TearDown network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" successfully" Jan 13 21:07:31.056827 containerd[1650]: time="2025-01-13T21:07:31.056638042Z" level=info msg="StopPodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" returns successfully" Jan 13 21:07:31.058324 kubelet[3044]: I0113 21:07:31.058306 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-node-sqrpt" podStartSLOduration=2.566295329 podStartE2EDuration="20.017770593s" podCreationTimestamp="2025-01-13 21:07:11 +0000 UTC" firstStartedPulling="2025-01-13 21:07:12.484894074 +0000 UTC m=+20.180442561" lastFinishedPulling="2025-01-13 21:07:29.936369338 +0000 UTC m=+37.631917825" observedRunningTime="2025-01-13 21:07:31.017576561 +0000 UTC m=+38.713125057" watchObservedRunningTime="2025-01-13 21:07:31.017770593 +0000 UTC m=+38.713319090" Jan 13 21:07:31.058708 containerd[1650]: time="2025-01-13T21:07:31.058694644Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:31.059046 containerd[1650]: time="2025-01-13T21:07:31.058797038Z" level=info msg="TearDown network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" successfully" Jan 13 21:07:31.059046 containerd[1650]: time="2025-01-13T21:07:31.058984105Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" returns successfully" Jan 13 21:07:31.061329 containerd[1650]: time="2025-01-13T21:07:31.061309468Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:31.062443 containerd[1650]: time="2025-01-13T21:07:31.061971333Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:31.062443 
containerd[1650]: time="2025-01-13T21:07:31.061983273Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:31.062443 containerd[1650]: time="2025-01-13T21:07:31.062223856Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:31.062443 containerd[1650]: time="2025-01-13T21:07:31.062265689Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:31.062443 containerd[1650]: time="2025-01-13T21:07:31.062271674Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:31.063054 containerd[1650]: time="2025-01-13T21:07:31.062925704Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:31.063054 containerd[1650]: time="2025-01-13T21:07:31.062967593Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:31.063054 containerd[1650]: time="2025-01-13T21:07:31.062974455Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:31.064190 containerd[1650]: time="2025-01-13T21:07:31.064007621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:6,}" Jan 13 21:07:31.066767 kubelet[3044]: I0113 21:07:31.066750 3044 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143" Jan 13 21:07:31.067777 containerd[1650]: time="2025-01-13T21:07:31.067662940Z" level=info msg="StopPodSandbox for 
\"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\"" Jan 13 21:07:31.068856 containerd[1650]: time="2025-01-13T21:07:31.068518676Z" level=info msg="Ensure that sandbox 1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143 in task-service has been cleanup successfully" Jan 13 21:07:31.075087 containerd[1650]: time="2025-01-13T21:07:31.075025905Z" level=info msg="TearDown network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" successfully" Jan 13 21:07:31.075087 containerd[1650]: time="2025-01-13T21:07:31.075044559Z" level=info msg="StopPodSandbox for \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" returns successfully" Jan 13 21:07:31.075827 containerd[1650]: time="2025-01-13T21:07:31.075772533Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" Jan 13 21:07:31.075967 containerd[1650]: time="2025-01-13T21:07:31.075944889Z" level=info msg="TearDown network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" successfully" Jan 13 21:07:31.075967 containerd[1650]: time="2025-01-13T21:07:31.075954094Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" returns successfully" Jan 13 21:07:31.076690 containerd[1650]: time="2025-01-13T21:07:31.076643040Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:31.076793 containerd[1650]: time="2025-01-13T21:07:31.076784832Z" level=info msg="TearDown network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" successfully" Jan 13 21:07:31.076992 containerd[1650]: time="2025-01-13T21:07:31.076955727Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" returns successfully" Jan 13 21:07:31.077295 containerd[1650]: time="2025-01-13T21:07:31.077284657Z" level=info 
msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:31.077577 containerd[1650]: time="2025-01-13T21:07:31.077552269Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:31.077577 containerd[1650]: time="2025-01-13T21:07:31.077562146Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:31.077942 containerd[1650]: time="2025-01-13T21:07:31.077928254Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\"" Jan 13 21:07:31.077989 containerd[1650]: time="2025-01-13T21:07:31.077972195Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully" Jan 13 21:07:31.077989 containerd[1650]: time="2025-01-13T21:07:31.077978826Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:31.078577 containerd[1650]: time="2025-01-13T21:07:31.078560785Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\"" Jan 13 21:07:31.078634 containerd[1650]: time="2025-01-13T21:07:31.078602913Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully" Jan 13 21:07:31.078634 containerd[1650]: time="2025-01-13T21:07:31.078610581Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully" Jan 13 21:07:31.079131 containerd[1650]: time="2025-01-13T21:07:31.078897283Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\"" Jan 13 21:07:31.079131 containerd[1650]: time="2025-01-13T21:07:31.079110588Z" level=info msg="TearDown network for 
sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully" Jan 13 21:07:31.079131 containerd[1650]: time="2025-01-13T21:07:31.079118454Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully" Jan 13 21:07:31.079733 containerd[1650]: time="2025-01-13T21:07:31.079706851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:7,}" Jan 13 21:07:31.421022 systemd[1]: run-netns-cni\x2d513f43ab\x2d435a\x2d16bd\x2dc011\x2d7cd76aee4abf.mount: Deactivated successfully. Jan 13 21:07:31.421114 systemd[1]: run-netns-cni\x2db63308c8\x2d47a9\x2d6ce3\x2dcccb\x2dc2a434400282.mount: Deactivated successfully. Jan 13 21:07:31.421175 systemd[1]: run-netns-cni\x2da9f739cb\x2d6bff\x2d23ea\x2d92e0\x2d184b0b027354.mount: Deactivated successfully. Jan 13 21:07:31.477665 systemd-networkd[1293]: cali791e56694a1: Link UP Jan 13 21:07:31.477765 systemd-networkd[1293]: cali791e56694a1: Gained carrier Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.151 [INFO][5094] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.163 [INFO][5094] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--bhkcp-eth0 coredns-76f75df574- kube-system 000eb469-bf65-41e7-a410-d2b847f032c6 692 0 2025-01-13 21:07:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-bhkcp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali791e56694a1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} 
ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.163 [INFO][5094] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.419 [INFO][5145] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" HandleID="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Workload="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.443 [INFO][5145] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" HandleID="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Workload="localhost-k8s-coredns--76f75df574--bhkcp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003c20c0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-bhkcp", "timestamp":"2025-01-13 21:07:31.419841794 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.443 [INFO][5145] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5145] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5145] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.445 [INFO][5145] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.451 [INFO][5145] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.453 [INFO][5145] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.453 [INFO][5145] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.454 [INFO][5145] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.455 [INFO][5145] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.455 [INFO][5145] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0 Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.457 [INFO][5145] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.460 [INFO][5145] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.460 [INFO][5145] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" host="localhost" Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.460 [INFO][5145] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.487353 containerd[1650]: 2025-01-13 21:07:31.460 [INFO][5145] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" HandleID="k8s-pod-network.c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Workload="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.487848 containerd[1650]: 2025-01-13 21:07:31.462 [INFO][5094] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--bhkcp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"000eb469-bf65-41e7-a410-d2b847f032c6", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-bhkcp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali791e56694a1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.487848 containerd[1650]: 2025-01-13 21:07:31.462 [INFO][5094] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.487848 containerd[1650]: 2025-01-13 21:07:31.462 [INFO][5094] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali791e56694a1 ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.487848 containerd[1650]: 2025-01-13 21:07:31.471 [INFO][5094] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 
21:07:31.487848 containerd[1650]: 2025-01-13 21:07:31.471 [INFO][5094] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--bhkcp-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"000eb469-bf65-41e7-a410-d2b847f032c6", ResourceVersion:"692", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0", Pod:"coredns-76f75df574-bhkcp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali791e56694a1", MAC:"b6:66:66:95:79:58", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.489116 containerd[1650]: 2025-01-13 21:07:31.480 [INFO][5094] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0" Namespace="kube-system" Pod="coredns-76f75df574-bhkcp" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bhkcp-eth0" Jan 13 21:07:31.492490 systemd-networkd[1293]: calib48dc55c397: Link UP Jan 13 21:07:31.493308 systemd-networkd[1293]: calib48dc55c397: Gained carrier Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.111 [INFO][5076] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.149 [INFO][5076] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0 calico-apiserver-7b989c8495- calico-apiserver 27d57ea0-66c3-451f-b6eb-d7050a90a389 693 0 2025-01-13 21:07:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7b989c8495 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7b989c8495-9bjj8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calib48dc55c397 [] []}} ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.149 [INFO][5076] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.419 [INFO][5147] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" HandleID="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Workload="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5147] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" HandleID="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Workload="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003960e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7b989c8495-9bjj8", "timestamp":"2025-01-13 21:07:31.419567225 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5147] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.460 [INFO][5147] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.461 [INFO][5147] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.462 [INFO][5147] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.465 [INFO][5147] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.467 [INFO][5147] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.469 [INFO][5147] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.472 [INFO][5147] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.472 [INFO][5147] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.473 [INFO][5147] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1 Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.476 [INFO][5147] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.482 [INFO][5147] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.483 [INFO][5147] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" host="localhost" Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.483 [INFO][5147] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.506669 containerd[1650]: 2025-01-13 21:07:31.483 [INFO][5147] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" HandleID="k8s-pod-network.ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Workload="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.488 [INFO][5076] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0", GenerateName:"calico-apiserver-7b989c8495-", Namespace:"calico-apiserver", SelfLink:"", UID:"27d57ea0-66c3-451f-b6eb-d7050a90a389", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7b989c8495", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7b989c8495-9bjj8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib48dc55c397", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.489 [INFO][5076] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.489 [INFO][5076] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib48dc55c397 ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.493 [INFO][5076] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.493 [INFO][5076] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0", GenerateName:"calico-apiserver-7b989c8495-", Namespace:"calico-apiserver", SelfLink:"", UID:"27d57ea0-66c3-451f-b6eb-d7050a90a389", ResourceVersion:"693", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7b989c8495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1", Pod:"calico-apiserver-7b989c8495-9bjj8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calib48dc55c397", MAC:"02:c7:77:82:4f:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.509507 containerd[1650]: 2025-01-13 21:07:31.502 [INFO][5076] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-9bjj8" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--9bjj8-eth0" Jan 13 21:07:31.538788 systemd-networkd[1293]: calibe6f4d5991a: Link UP Jan 13 21:07:31.539375 systemd-networkd[1293]: calibe6f4d5991a: Gained carrier Jan 13 21:07:31.544967 containerd[1650]: time="2025-01-13T21:07:31.544907300Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.545090 containerd[1650]: time="2025-01-13T21:07:31.545074596Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.545170 containerd[1650]: time="2025-01-13T21:07:31.545150638Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.545428 containerd[1650]: time="2025-01-13T21:07:31.545406151Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.561582 systemd-networkd[1293]: cali36b4a776911: Link UP Jan 13 21:07:31.562290 systemd-networkd[1293]: cali36b4a776911: Gained carrier Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.080 [INFO][5058] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.148 [INFO][5058] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--76f75df574--bbgth-eth0 coredns-76f75df574- kube-system 843fc70f-b236-4057-8b13-ac6f976b6914 687 0 2025-01-13 21:07:06 +0000 UTC map[k8s-app:kube-dns pod-template-hash:76f75df574 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-76f75df574-bbgth eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calibe6f4d5991a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.148 [INFO][5058] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.418 [INFO][5141] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" HandleID="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Workload="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.566954 containerd[1650]: 
2025-01-13 21:07:31.442 [INFO][5141] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" HandleID="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Workload="localhost-k8s-coredns--76f75df574--bbgth-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000408ac0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-76f75df574-bbgth", "timestamp":"2025-01-13 21:07:31.418777032 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.442 [INFO][5141] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.483 [INFO][5141] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.483 [INFO][5141] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.485 [INFO][5141] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.491 [INFO][5141] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.499 [INFO][5141] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.503 [INFO][5141] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.505 [INFO][5141] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.505 [INFO][5141] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.506 [INFO][5141] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.509 [INFO][5141] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.513 [INFO][5141] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.515 [INFO][5141] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" host="localhost" Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.515 [INFO][5141] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.566954 containerd[1650]: 2025-01-13 21:07:31.515 [INFO][5141] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" HandleID="k8s-pod-network.6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Workload="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.569341 containerd[1650]: 2025-01-13 21:07:31.526 [INFO][5058] cni-plugin/k8s.go 386: Populated endpoint ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--bbgth-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"843fc70f-b236-4057-8b13-ac6f976b6914", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-76f75df574-bbgth", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibe6f4d5991a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.569341 containerd[1650]: 2025-01-13 21:07:31.528 [INFO][5058] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.569341 containerd[1650]: 2025-01-13 21:07:31.532 [INFO][5058] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibe6f4d5991a ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.569341 containerd[1650]: 2025-01-13 21:07:31.539 [INFO][5058] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 
21:07:31.569341 containerd[1650]: 2025-01-13 21:07:31.540 [INFO][5058] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--76f75df574--bbgth-eth0", GenerateName:"coredns-76f75df574-", Namespace:"kube-system", SelfLink:"", UID:"843fc70f-b236-4057-8b13-ac6f976b6914", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"76f75df574", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af", Pod:"coredns-76f75df574-bbgth", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calibe6f4d5991a", MAC:"e2:22:c0:55:6a:d8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.571915 containerd[1650]: 2025-01-13 21:07:31.553 [INFO][5058] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af" Namespace="kube-system" Pod="coredns-76f75df574-bbgth" WorkloadEndpoint="localhost-k8s-coredns--76f75df574--bbgth-eth0" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.110 [INFO][5067] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.148 [INFO][5067] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0 calico-apiserver-7b989c8495- calico-apiserver ddbbefea-15ee-4654-983a-5f453cee91ee 691 0 2025-01-13 21:07:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7b989c8495 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-7b989c8495-wsw2f eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali36b4a776911 [] []}} ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.150 [INFO][5067] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.419 [INFO][5144] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" HandleID="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Workload="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.442 [INFO][5144] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" HandleID="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Workload="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050170), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-7b989c8495-wsw2f", "timestamp":"2025-01-13 21:07:31.419715481 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.442 [INFO][5144] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.515 [INFO][5144] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.515 [INFO][5144] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.518 [INFO][5144] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.525 [INFO][5144] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.528 [INFO][5144] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.530 [INFO][5144] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.538 [INFO][5144] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.538 [INFO][5144] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.541 [INFO][5144] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4 Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.546 [INFO][5144] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.553 [INFO][5144] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.553 [INFO][5144] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" host="localhost" Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.553 [INFO][5144] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.579893 containerd[1650]: 2025-01-13 21:07:31.554 [INFO][5144] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" HandleID="k8s-pod-network.cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Workload="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.556 [INFO][5067] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0", GenerateName:"calico-apiserver-7b989c8495-", Namespace:"calico-apiserver", SelfLink:"", UID:"ddbbefea-15ee-4654-983a-5f453cee91ee", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7b989c8495", "projectcalico.org/namespace":"calico-apiserver", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-7b989c8495-wsw2f", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali36b4a776911", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.556 [INFO][5067] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.556 [INFO][5067] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali36b4a776911 ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.564 [INFO][5067] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.565 [INFO][5067] cni-plugin/k8s.go 414: Added Mac, interface name, and active 
container ID to endpoint ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0", GenerateName:"calico-apiserver-7b989c8495-", Namespace:"calico-apiserver", SelfLink:"", UID:"ddbbefea-15ee-4654-983a-5f453cee91ee", ResourceVersion:"691", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7b989c8495", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4", Pod:"calico-apiserver-7b989c8495-wsw2f", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali36b4a776911", MAC:"e6:b0:87:18:6b:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.580315 containerd[1650]: 2025-01-13 21:07:31.575 [INFO][5067] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore 
ContainerID="cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4" Namespace="calico-apiserver" Pod="calico-apiserver-7b989c8495-wsw2f" WorkloadEndpoint="localhost-k8s-calico--apiserver--7b989c8495--wsw2f-eth0" Jan 13 21:07:31.589938 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.592495 containerd[1650]: time="2025-01-13T21:07:31.590958075Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.596655 containerd[1650]: time="2025-01-13T21:07:31.593067538Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.596655 containerd[1650]: time="2025-01-13T21:07:31.593083076Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.610726 containerd[1650]: time="2025-01-13T21:07:31.597155155Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.633746 systemd-networkd[1293]: calib26cc89f0b4: Link UP Jan 13 21:07:31.639984 systemd-networkd[1293]: calib26cc89f0b4: Gained carrier Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.156 [INFO][5088] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.175 [INFO][5088] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--x4zz7-eth0 csi-node-driver- calico-system 59df9a6a-247e-477f-9e6c-e3512a44f477 593 0 2025-01-13 21:07:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b695c467 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-x4zz7 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib26cc89f0b4 [] []}} ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.175 [INFO][5088] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.418 [INFO][5149] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" HandleID="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" 
Workload="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.443 [INFO][5149] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" HandleID="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Workload="localhost-k8s-csi--node--driver--x4zz7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039f0f0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-x4zz7", "timestamp":"2025-01-13 21:07:31.41824353 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.443 [INFO][5149] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.554 [INFO][5149] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.554 [INFO][5149] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.558 [INFO][5149] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.576 [INFO][5149] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.584 [INFO][5149] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.586 [INFO][5149] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.589 [INFO][5149] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.589 [INFO][5149] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.591 [INFO][5149] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1 Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.598 [INFO][5149] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.614 [INFO][5149] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.615 [INFO][5149] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" host="localhost" Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.615 [INFO][5149] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.652608 containerd[1650]: 2025-01-13 21:07:31.615 [INFO][5149] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" HandleID="k8s-pod-network.b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Workload="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.624 [INFO][5088] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x4zz7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"59df9a6a-247e-477f-9e6c-e3512a44f477", ResourceVersion:"593", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-x4zz7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib26cc89f0b4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.624 [INFO][5088] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.625 [INFO][5088] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib26cc89f0b4 ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.639 [INFO][5088] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.641 [INFO][5088] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" 
Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--x4zz7-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"59df9a6a-247e-477f-9e6c-e3512a44f477", ResourceVersion:"593", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b695c467", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1", Pod:"csi-node-driver-x4zz7", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib26cc89f0b4", MAC:"2a:a5:a1:e7:c9:3d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.653740 containerd[1650]: 2025-01-13 21:07:31.647 [INFO][5088] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1" Namespace="calico-system" Pod="csi-node-driver-x4zz7" WorkloadEndpoint="localhost-k8s-csi--node--driver--x4zz7-eth0" Jan 13 21:07:31.657900 containerd[1650]: 
time="2025-01-13T21:07:31.656780096Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.657900 containerd[1650]: time="2025-01-13T21:07:31.656830736Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.657900 containerd[1650]: time="2025-01-13T21:07:31.656855489Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.657900 containerd[1650]: time="2025-01-13T21:07:31.656903481Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.660862 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.670632 containerd[1650]: time="2025-01-13T21:07:31.670302554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bhkcp,Uid:000eb469-bf65-41e7-a410-d2b847f032c6,Namespace:kube-system,Attempt:7,} returns sandbox id \"c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0\"" Jan 13 21:07:31.679851 containerd[1650]: time="2025-01-13T21:07:31.674722230Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.679851 containerd[1650]: time="2025-01-13T21:07:31.674798625Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.679851 containerd[1650]: time="2025-01-13T21:07:31.674999616Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.679851 containerd[1650]: time="2025-01-13T21:07:31.675069062Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.684909 containerd[1650]: time="2025-01-13T21:07:31.684452302Z" level=info msg="CreateContainer within sandbox \"c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 21:07:31.684535 systemd-networkd[1293]: cali7de5c24b94f: Link UP Jan 13 21:07:31.685510 systemd-networkd[1293]: cali7de5c24b94f: Gained carrier Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.166 [INFO][5116] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.179 [INFO][5116] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0 calico-kube-controllers-76db674964- calico-system 78856142-3d1a-4f19-a50a-e5f8505c2330 690 0 2025-01-13 21:07:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:76db674964 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-76db674964-8lv5h eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7de5c24b94f [] []}} ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.179 [INFO][5116] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.418 [INFO][5150] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" HandleID="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Workload="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5150] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" HandleID="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Workload="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004a0910), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-76db674964-8lv5h", "timestamp":"2025-01-13 21:07:31.418419565 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.444 [INFO][5150] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.618 [INFO][5150] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.619 [INFO][5150] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.625 [INFO][5150] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.637 [INFO][5150] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.651 [INFO][5150] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.654 [INFO][5150] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.655 [INFO][5150] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.656 [INFO][5150] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.658 [INFO][5150] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.663 [INFO][5150] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.667 [INFO][5150] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.667 [INFO][5150] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" host="localhost" Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.667 [INFO][5150] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jan 13 21:07:31.698661 containerd[1650]: 2025-01-13 21:07:31.667 [INFO][5150] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" HandleID="k8s-pod-network.7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Workload="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.673 [INFO][5116] cni-plugin/k8s.go 386: Populated endpoint ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0", GenerateName:"calico-kube-controllers-76db674964-", Namespace:"calico-system", SelfLink:"", UID:"78856142-3d1a-4f19-a50a-e5f8505c2330", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76db674964", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-76db674964-8lv5h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7de5c24b94f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.674 [INFO][5116] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.674 [INFO][5116] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7de5c24b94f ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.687 [INFO][5116] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.687 [INFO][5116] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0", GenerateName:"calico-kube-controllers-76db674964-", Namespace:"calico-system", SelfLink:"", UID:"78856142-3d1a-4f19-a50a-e5f8505c2330", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.January, 13, 21, 7, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76db674964", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d", Pod:"calico-kube-controllers-76db674964-8lv5h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7de5c24b94f", MAC:"2a:e3:03:88:2a:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Jan 13 21:07:31.699332 containerd[1650]: 2025-01-13 21:07:31.696 [INFO][5116] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d" Namespace="calico-system" Pod="calico-kube-controllers-76db674964-8lv5h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--76db674964--8lv5h-eth0" Jan 13 21:07:31.715657 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.721655 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.733680 containerd[1650]: time="2025-01-13T21:07:31.732975857Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.733680 containerd[1650]: time="2025-01-13T21:07:31.733013507Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.733680 containerd[1650]: time="2025-01-13T21:07:31.733024610Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.735295 containerd[1650]: time="2025-01-13T21:07:31.734026376Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.757457 containerd[1650]: time="2025-01-13T21:07:31.757434556Z" level=info msg="CreateContainer within sandbox \"c337631ef3288dbfccbfe2b1bbc35874be44833f45e3aeca1a7a8bc9a8bf53d0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"485015f1a6b6a08d21eddb0bbf5297bfb0edff84d4454d8be14b2a24f3107a6d\"" Jan 13 21:07:31.757903 containerd[1650]: time="2025-01-13T21:07:31.757852551Z" level=info msg="StartContainer for \"485015f1a6b6a08d21eddb0bbf5297bfb0edff84d4454d8be14b2a24f3107a6d\"" Jan 13 21:07:31.758325 containerd[1650]: time="2025-01-13T21:07:31.757865181Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Jan 13 21:07:31.758325 containerd[1650]: time="2025-01-13T21:07:31.757995564Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Jan 13 21:07:31.758325 containerd[1650]: time="2025-01-13T21:07:31.758011473Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.758325 containerd[1650]: time="2025-01-13T21:07:31.758135404Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.pause\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Jan 13 21:07:31.766816 containerd[1650]: time="2025-01-13T21:07:31.766238225Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-9bjj8,Uid:27d57ea0-66c3-451f-b6eb-d7050a90a389,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1\"" Jan 13 21:07:31.767824 containerd[1650]: time="2025-01-13T21:07:31.767670185Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 21:07:31.769693 containerd[1650]: time="2025-01-13T21:07:31.769663765Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-76f75df574-bbgth,Uid:843fc70f-b236-4057-8b13-ac6f976b6914,Namespace:kube-system,Attempt:7,} returns sandbox id \"6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af\"" Jan 13 21:07:31.779883 containerd[1650]: time="2025-01-13T21:07:31.779859066Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7b989c8495-wsw2f,Uid:ddbbefea-15ee-4654-983a-5f453cee91ee,Namespace:calico-apiserver,Attempt:7,} returns sandbox id \"cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4\"" Jan 13 21:07:31.780841 containerd[1650]: time="2025-01-13T21:07:31.780773299Z" level=info msg="CreateContainer within sandbox \"6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 13 21:07:31.794611 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.799948 systemd-resolved[1547]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Jan 13 21:07:31.800645 containerd[1650]: time="2025-01-13T21:07:31.800598095Z" level=info msg="CreateContainer within sandbox \"6788410dfd531ca17707b62aad404d46bf3bfa44e9ed621ec4125b81bdb001af\" for 
&ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"429b099939e20b979cf32e2e164b002d44b26d343ccea5e5b1ea037158937df9\"" Jan 13 21:07:31.803444 containerd[1650]: time="2025-01-13T21:07:31.803425167Z" level=info msg="StartContainer for \"429b099939e20b979cf32e2e164b002d44b26d343ccea5e5b1ea037158937df9\"" Jan 13 21:07:31.805483 containerd[1650]: time="2025-01-13T21:07:31.805469417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-x4zz7,Uid:59df9a6a-247e-477f-9e6c-e3512a44f477,Namespace:calico-system,Attempt:6,} returns sandbox id \"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1\"" Jan 13 21:07:31.830692 containerd[1650]: time="2025-01-13T21:07:31.830581397Z" level=info msg="StartContainer for \"485015f1a6b6a08d21eddb0bbf5297bfb0edff84d4454d8be14b2a24f3107a6d\" returns successfully" Jan 13 21:07:31.842890 containerd[1650]: time="2025-01-13T21:07:31.842871346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76db674964-8lv5h,Uid:78856142-3d1a-4f19-a50a-e5f8505c2330,Namespace:calico-system,Attempt:7,} returns sandbox id \"7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d\"" Jan 13 21:07:31.855617 containerd[1650]: time="2025-01-13T21:07:31.855593822Z" level=info msg="StartContainer for \"429b099939e20b979cf32e2e164b002d44b26d343ccea5e5b1ea037158937df9\" returns successfully" Jan 13 21:07:32.107197 kubelet[3044]: I0113 21:07:32.106737 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 21:07:32.134934 kubelet[3044]: I0113 21:07:32.133688 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-bbgth" podStartSLOduration=26.133657882 podStartE2EDuration="26.133657882s" podCreationTimestamp="2025-01-13 21:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:07:32.110138183 +0000 
UTC m=+39.805686672" watchObservedRunningTime="2025-01-13 21:07:32.133657882 +0000 UTC m=+39.829206371" Jan 13 21:07:32.429825 kernel: bpftool[5685]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set Jan 13 21:07:32.565268 systemd-networkd[1293]: vxlan.calico: Link UP Jan 13 21:07:32.565273 systemd-networkd[1293]: vxlan.calico: Gained carrier Jan 13 21:07:32.674922 systemd-networkd[1293]: cali791e56694a1: Gained IPv6LL Jan 13 21:07:32.738935 systemd-networkd[1293]: calib48dc55c397: Gained IPv6LL Jan 13 21:07:32.802997 systemd-networkd[1293]: calib26cc89f0b4: Gained IPv6LL Jan 13 21:07:33.058915 systemd-networkd[1293]: calibe6f4d5991a: Gained IPv6LL Jan 13 21:07:33.103090 kubelet[3044]: I0113 21:07:33.103056 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="kube-system/coredns-76f75df574-bhkcp" podStartSLOduration=27.103028058 podStartE2EDuration="27.103028058s" podCreationTimestamp="2025-01-13 21:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-01-13 21:07:32.135316175 +0000 UTC m=+39.830864671" watchObservedRunningTime="2025-01-13 21:07:33.103028058 +0000 UTC m=+40.798576554" Jan 13 21:07:33.122922 systemd-networkd[1293]: cali7de5c24b94f: Gained IPv6LL Jan 13 21:07:33.506946 systemd-networkd[1293]: cali36b4a776911: Gained IPv6LL Jan 13 21:07:34.403065 systemd-networkd[1293]: vxlan.calico: Gained IPv6LL Jan 13 21:07:34.504757 containerd[1650]: time="2025-01-13T21:07:34.504094871Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:34.504757 containerd[1650]: time="2025-01-13T21:07:34.504530930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=42001404" Jan 13 21:07:34.504757 containerd[1650]: time="2025-01-13T21:07:34.504724907Z" level=info msg="ImageCreate event 
name:\"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:34.506328 containerd[1650]: time="2025-01-13T21:07:34.506290379Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:34.507240 containerd[1650]: time="2025-01-13T21:07:34.506918364Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 2.739061734s" Jan 13 21:07:34.507240 containerd[1650]: time="2025-01-13T21:07:34.506941924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 21:07:34.508041 containerd[1650]: time="2025-01-13T21:07:34.507404177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Jan 13 21:07:34.509988 containerd[1650]: time="2025-01-13T21:07:34.509959565Z" level=info msg="CreateContainer within sandbox \"ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 21:07:34.525497 containerd[1650]: time="2025-01-13T21:07:34.525447671Z" level=info msg="CreateContainer within sandbox \"ae6df87e19e47bc6c55998fd61eecb72791b109e44cdab10dd420c5ee3b745c1\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7af62261bda7fbb2ffb6e4436cb0f4bb34e154d42be4f12b46810b730303d817\"" Jan 13 21:07:34.526266 containerd[1650]: time="2025-01-13T21:07:34.525909704Z" level=info 
msg="StartContainer for \"7af62261bda7fbb2ffb6e4436cb0f4bb34e154d42be4f12b46810b730303d817\"" Jan 13 21:07:34.609963 containerd[1650]: time="2025-01-13T21:07:34.609935348Z" level=info msg="StartContainer for \"7af62261bda7fbb2ffb6e4436cb0f4bb34e154d42be4f12b46810b730303d817\" returns successfully" Jan 13 21:07:34.957332 containerd[1650]: time="2025-01-13T21:07:34.957298146Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:34.957649 containerd[1650]: time="2025-01-13T21:07:34.957617878Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.1: active requests=0, bytes read=77" Jan 13 21:07:34.959354 containerd[1650]: time="2025-01-13T21:07:34.959330678Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" with image id \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486\", size \"43494504\" in 451.90743ms" Jan 13 21:07:34.959417 containerd[1650]: time="2025-01-13T21:07:34.959354537Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Jan 13 21:07:34.960562 containerd[1650]: time="2025-01-13T21:07:34.960512874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Jan 13 21:07:34.970463 containerd[1650]: time="2025-01-13T21:07:34.970369339Z" level=info msg="CreateContainer within sandbox \"cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jan 13 21:07:35.001562 containerd[1650]: time="2025-01-13T21:07:35.001498096Z" level=info msg="CreateContainer within sandbox 
\"cd606311e5a3e75a881c99e5ec725f390debc6fb85fece6d0669e0af71d7b6a4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"6b7473ac371228f585416ffbe965acc98af3e9ba22b9a5b8822fa2806e2ec30a\"" Jan 13 21:07:35.002242 containerd[1650]: time="2025-01-13T21:07:35.002218780Z" level=info msg="StartContainer for \"6b7473ac371228f585416ffbe965acc98af3e9ba22b9a5b8822fa2806e2ec30a\"" Jan 13 21:07:35.081873 containerd[1650]: time="2025-01-13T21:07:35.081821403Z" level=info msg="StartContainer for \"6b7473ac371228f585416ffbe965acc98af3e9ba22b9a5b8822fa2806e2ec30a\" returns successfully" Jan 13 21:07:35.112609 kubelet[3044]: I0113 21:07:35.112588 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7b989c8495-wsw2f" podStartSLOduration=20.934884574 podStartE2EDuration="24.112561629s" podCreationTimestamp="2025-01-13 21:07:11 +0000 UTC" firstStartedPulling="2025-01-13 21:07:31.781881372 +0000 UTC m=+39.477429859" lastFinishedPulling="2025-01-13 21:07:34.959558419 +0000 UTC m=+42.655106914" observedRunningTime="2025-01-13 21:07:35.111759823 +0000 UTC m=+42.807308319" watchObservedRunningTime="2025-01-13 21:07:35.112561629 +0000 UTC m=+42.808110119" Jan 13 21:07:36.110561 kubelet[3044]: I0113 21:07:36.110230 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 21:07:36.110561 kubelet[3044]: I0113 21:07:36.110466 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 21:07:36.773080 containerd[1650]: time="2025-01-13T21:07:36.773053620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:36.773517 containerd[1650]: time="2025-01-13T21:07:36.773492188Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.1: active requests=0, bytes read=7902632" Jan 13 21:07:36.773743 containerd[1650]: time="2025-01-13T21:07:36.773729707Z" 
level=info msg="ImageCreate event name:\"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:36.774833 containerd[1650]: time="2025-01-13T21:07:36.774797194Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:36.775461 containerd[1650]: time="2025-01-13T21:07:36.775197783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.1\" with image id \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3\", size \"9395716\" in 1.814634442s" Jan 13 21:07:36.775461 containerd[1650]: time="2025-01-13T21:07:36.775213621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Jan 13 21:07:36.775655 containerd[1650]: time="2025-01-13T21:07:36.775640442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Jan 13 21:07:36.778142 containerd[1650]: time="2025-01-13T21:07:36.778128748Z" level=info msg="CreateContainer within sandbox \"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jan 13 21:07:36.791267 containerd[1650]: time="2025-01-13T21:07:36.791180490Z" level=info msg="CreateContainer within sandbox \"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0f18a2d9f339d61bfb2b8966a3a429c49a4368586c8773e5f0122375bfad1281\"" Jan 13 21:07:36.791480 containerd[1650]: time="2025-01-13T21:07:36.791465677Z" level=info msg="StartContainer 
for \"0f18a2d9f339d61bfb2b8966a3a429c49a4368586c8773e5f0122375bfad1281\"" Jan 13 21:07:36.827595 containerd[1650]: time="2025-01-13T21:07:36.827494410Z" level=info msg="StartContainer for \"0f18a2d9f339d61bfb2b8966a3a429c49a4368586c8773e5f0122375bfad1281\" returns successfully" Jan 13 21:07:39.093392 containerd[1650]: time="2025-01-13T21:07:39.093355418Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:39.104592 containerd[1650]: time="2025-01-13T21:07:39.104533134Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.1: active requests=0, bytes read=34141192" Jan 13 21:07:39.112168 containerd[1650]: time="2025-01-13T21:07:39.112103426Z" level=info msg="ImageCreate event name:\"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:39.126781 containerd[1650]: time="2025-01-13T21:07:39.126745686Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:39.133286 containerd[1650]: time="2025-01-13T21:07:39.127149934Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" with image id \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9\", size \"35634244\" in 2.351485257s" Jan 13 21:07:39.133286 containerd[1650]: time="2025-01-13T21:07:39.127190542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Jan 
13 21:07:39.133286 containerd[1650]: time="2025-01-13T21:07:39.127660909Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Jan 13 21:07:39.176277 containerd[1650]: time="2025-01-13T21:07:39.176142005Z" level=info msg="CreateContainer within sandbox \"7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jan 13 21:07:39.182853 containerd[1650]: time="2025-01-13T21:07:39.182570260Z" level=info msg="CreateContainer within sandbox \"7f16c468ca28c5f75b2efcbe2edb83742616bb217970d63f79ce268bd8414b6d\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"c7d0470dd27681e4847b32541b050efb353a95d5f8e59fdb90d2fb3ee4db09c0\"" Jan 13 21:07:39.183642 containerd[1650]: time="2025-01-13T21:07:39.183619069Z" level=info msg="StartContainer for \"c7d0470dd27681e4847b32541b050efb353a95d5f8e59fdb90d2fb3ee4db09c0\"" Jan 13 21:07:39.237033 containerd[1650]: time="2025-01-13T21:07:39.237007475Z" level=info msg="StartContainer for \"c7d0470dd27681e4847b32541b050efb353a95d5f8e59fdb90d2fb3ee4db09c0\" returns successfully" Jan 13 21:07:39.453737 kubelet[3044]: I0113 21:07:39.453664 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 21:07:39.857127 kubelet[3044]: I0113 21:07:39.856683 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7b989c8495-9bjj8" podStartSLOduration=26.116957822 podStartE2EDuration="28.856657609s" podCreationTimestamp="2025-01-13 21:07:11 +0000 UTC" firstStartedPulling="2025-01-13 21:07:31.767460132 +0000 UTC m=+39.463008618" lastFinishedPulling="2025-01-13 21:07:34.507159912 +0000 UTC m=+42.202708405" observedRunningTime="2025-01-13 21:07:35.123058868 +0000 UTC m=+42.818607358" watchObservedRunningTime="2025-01-13 21:07:39.856657609 +0000 UTC m=+47.552206100" Jan 13 21:07:40.132208 kubelet[3044]: I0113 21:07:40.132080 
3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-76db674964-8lv5h" podStartSLOduration=20.84832423 podStartE2EDuration="28.131470156s" podCreationTimestamp="2025-01-13 21:07:12 +0000 UTC" firstStartedPulling="2025-01-13 21:07:31.844233008 +0000 UTC m=+39.539781495" lastFinishedPulling="2025-01-13 21:07:39.12737893 +0000 UTC m=+46.822927421" observedRunningTime="2025-01-13 21:07:40.130869408 +0000 UTC m=+47.826417905" watchObservedRunningTime="2025-01-13 21:07:40.131470156 +0000 UTC m=+47.827018643" Jan 13 21:07:40.857880 containerd[1650]: time="2025-01-13T21:07:40.857848178Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:40.868062 containerd[1650]: time="2025-01-13T21:07:40.866037212Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1: active requests=0, bytes read=10501081" Jan 13 21:07:40.872687 containerd[1650]: time="2025-01-13T21:07:40.872665198Z" level=info msg="ImageCreate event name:\"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:40.877315 containerd[1650]: time="2025-01-13T21:07:40.877292355Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 13 21:07:40.878010 containerd[1650]: time="2025-01-13T21:07:40.877625636Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" with image id \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\", repo digest 
\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76\", size \"11994117\" in 1.749918464s" Jan 13 21:07:40.878010 containerd[1650]: time="2025-01-13T21:07:40.877642393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Jan 13 21:07:40.878782 containerd[1650]: time="2025-01-13T21:07:40.878712664Z" level=info msg="CreateContainer within sandbox \"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jan 13 21:07:40.935909 containerd[1650]: time="2025-01-13T21:07:40.935870111Z" level=info msg="CreateContainer within sandbox \"b79b4516d71c1f6aa8333dd16208c2f5e5ae792376c6862527166e0a003529e1\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f8319019acae810c15b410cfe30e1af8a0467f68d26a6ca6cf3e7af40ecb247b\"" Jan 13 21:07:40.936840 containerd[1650]: time="2025-01-13T21:07:40.936257906Z" level=info msg="StartContainer for \"f8319019acae810c15b410cfe30e1af8a0467f68d26a6ca6cf3e7af40ecb247b\"" Jan 13 21:07:40.988973 containerd[1650]: time="2025-01-13T21:07:40.988906016Z" level=info msg="StartContainer for \"f8319019acae810c15b410cfe30e1af8a0467f68d26a6ca6cf3e7af40ecb247b\" returns successfully" Jan 13 21:07:41.927135 kubelet[3044]: I0113 21:07:41.927070 3044 csi_plugin.go:99] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jan 13 21:07:41.947527 kubelet[3044]: I0113 21:07:41.947484 3044 csi_plugin.go:112] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jan 13 21:07:51.939144 systemd-resolved[1547]: Under memory pressure, flushing caches. 
Jan 13 21:07:51.976671 systemd-journald[1197]: Under memory pressure, flushing caches. Jan 13 21:07:51.939167 systemd-resolved[1547]: Flushed all caches. Jan 13 21:07:52.640448 containerd[1650]: time="2025-01-13T21:07:52.640415468Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:52.640877 containerd[1650]: time="2025-01-13T21:07:52.640491565Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:52.640877 containerd[1650]: time="2025-01-13T21:07:52.640499724Z" level=info msg="StopPodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:52.649457 containerd[1650]: time="2025-01-13T21:07:52.649424252Z" level=info msg="RemovePodSandbox for \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:52.654683 containerd[1650]: time="2025-01-13T21:07:52.653124953Z" level=info msg="Forcibly stopping sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\"" Jan 13 21:07:52.654683 containerd[1650]: time="2025-01-13T21:07:52.654453927Z" level=info msg="TearDown network for sandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" successfully" Jan 13 21:07:52.716498 containerd[1650]: time="2025-01-13T21:07:52.716470318Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:52.743697 containerd[1650]: time="2025-01-13T21:07:52.743662633Z" level=info msg="RemovePodSandbox \"35956ad7da283e5c32b1f7f1dd3a7d3d3ac92e4029a99e2ba9adf8e508fa02cf\" returns successfully" Jan 13 21:07:52.744178 containerd[1650]: time="2025-01-13T21:07:52.744053197Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:52.744178 containerd[1650]: time="2025-01-13T21:07:52.744124528Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:52.744178 containerd[1650]: time="2025-01-13T21:07:52.744133734Z" level=info msg="StopPodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:52.744477 containerd[1650]: time="2025-01-13T21:07:52.744457898Z" level=info msg="RemovePodSandbox for \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:52.744520 containerd[1650]: time="2025-01-13T21:07:52.744480651Z" level=info msg="Forcibly stopping sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\"" Jan 13 21:07:52.744565 containerd[1650]: time="2025-01-13T21:07:52.744530528Z" level=info msg="TearDown network for sandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" successfully" Jan 13 21:07:52.772432 containerd[1650]: time="2025-01-13T21:07:52.772287911Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:52.772432 containerd[1650]: time="2025-01-13T21:07:52.772337987Z" level=info msg="RemovePodSandbox \"78d3265d83aa926a07543ad8c21e7220938b589fea58d1f6ef439b0ef43ddab6\" returns successfully" Jan 13 21:07:52.772983 containerd[1650]: time="2025-01-13T21:07:52.772877078Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:52.773194 containerd[1650]: time="2025-01-13T21:07:52.773170249Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:52.773390 containerd[1650]: time="2025-01-13T21:07:52.773183081Z" level=info msg="StopPodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:52.785219 containerd[1650]: time="2025-01-13T21:07:52.773640733Z" level=info msg="RemovePodSandbox for \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:52.785219 containerd[1650]: time="2025-01-13T21:07:52.773655647Z" level=info msg="Forcibly stopping sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\"" Jan 13 21:07:52.785219 containerd[1650]: time="2025-01-13T21:07:52.773694613Z" level=info msg="TearDown network for sandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" successfully" Jan 13 21:07:52.790463 containerd[1650]: time="2025-01-13T21:07:52.790382772Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:52.790463 containerd[1650]: time="2025-01-13T21:07:52.790414953Z" level=info msg="RemovePodSandbox \"577b6889df06337c793c9b10b8db398248d592813be35a40728f78dc0f81881c\" returns successfully" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790723113Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790773898Z" level=info msg="TearDown network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" successfully" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790813780Z" level=info msg="StopPodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" returns successfully" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790944959Z" level=info msg="RemovePodSandbox for \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790958718Z" level=info msg="Forcibly stopping sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\"" Jan 13 21:07:52.802160 containerd[1650]: time="2025-01-13T21:07:52.790993439Z" level=info msg="TearDown network for sandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" successfully" Jan 13 21:07:52.811470 containerd[1650]: time="2025-01-13T21:07:52.811452800Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:52.811618 containerd[1650]: time="2025-01-13T21:07:52.811556162Z" level=info msg="RemovePodSandbox \"0ee24fff9c3922b9c7e7a0d9ccc4ae4d445cf573d621f0baacc92b70abd28b37\" returns successfully"
Jan 13 21:07:52.811970 containerd[1650]: time="2025-01-13T21:07:52.811830619Z" level=info msg="StopPodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\""
Jan 13 21:07:52.811970 containerd[1650]: time="2025-01-13T21:07:52.811898803Z" level=info msg="TearDown network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" successfully"
Jan 13 21:07:52.811970 containerd[1650]: time="2025-01-13T21:07:52.811916773Z" level=info msg="StopPodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" returns successfully"
Jan 13 21:07:52.812624 containerd[1650]: time="2025-01-13T21:07:52.812456606Z" level=info msg="RemovePodSandbox for \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\""
Jan 13 21:07:52.812624 containerd[1650]: time="2025-01-13T21:07:52.812485133Z" level=info msg="Forcibly stopping sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\""
Jan 13 21:07:52.812842 containerd[1650]: time="2025-01-13T21:07:52.812780925Z" level=info msg="TearDown network for sandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" successfully"
Jan 13 21:07:52.832032 containerd[1650]: time="2025-01-13T21:07:52.831946769Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.832032 containerd[1650]: time="2025-01-13T21:07:52.831989337Z" level=info msg="RemovePodSandbox \"23c5bf4af3be737f4d04aa69405694b271484db6c27aafd0f97791299c7e4ff1\" returns successfully"
Jan 13 21:07:52.832245 containerd[1650]: time="2025-01-13T21:07:52.832224652Z" level=info msg="StopPodSandbox for \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\""
Jan 13 21:07:52.832313 containerd[1650]: time="2025-01-13T21:07:52.832293075Z" level=info msg="TearDown network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" successfully"
Jan 13 21:07:52.832313 containerd[1650]: time="2025-01-13T21:07:52.832307808Z" level=info msg="StopPodSandbox for \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" returns successfully"
Jan 13 21:07:52.834831 containerd[1650]: time="2025-01-13T21:07:52.832574510Z" level=info msg="RemovePodSandbox for \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\""
Jan 13 21:07:52.834831 containerd[1650]: time="2025-01-13T21:07:52.832600168Z" level=info msg="Forcibly stopping sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\""
Jan 13 21:07:52.834831 containerd[1650]: time="2025-01-13T21:07:52.832646035Z" level=info msg="TearDown network for sandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" successfully"
Jan 13 21:07:52.843979 containerd[1650]: time="2025-01-13T21:07:52.843954200Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.844302 containerd[1650]: time="2025-01-13T21:07:52.844085323Z" level=info msg="RemovePodSandbox \"1a1c87cafb0631bcd1045c050fb5c0fe30fbd0e9c98558c16c5d86e3c67c93cc\" returns successfully"
Jan 13 21:07:52.844543 containerd[1650]: time="2025-01-13T21:07:52.844437331Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\""
Jan 13 21:07:52.844543 containerd[1650]: time="2025-01-13T21:07:52.844501322Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully"
Jan 13 21:07:52.844543 containerd[1650]: time="2025-01-13T21:07:52.844509812Z" level=info msg="StopPodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully"
Jan 13 21:07:52.845356 containerd[1650]: time="2025-01-13T21:07:52.844770727Z" level=info msg="RemovePodSandbox for \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\""
Jan 13 21:07:52.845356 containerd[1650]: time="2025-01-13T21:07:52.844887635Z" level=info msg="Forcibly stopping sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\""
Jan 13 21:07:52.845356 containerd[1650]: time="2025-01-13T21:07:52.844935010Z" level=info msg="TearDown network for sandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" successfully"
Jan 13 21:07:52.860700 containerd[1650]: time="2025-01-13T21:07:52.860443538Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.860700 containerd[1650]: time="2025-01-13T21:07:52.860480460Z" level=info msg="RemovePodSandbox \"f5d5fa0760cfd135da7c5184515a2841ef7a330bf687e5c24c9f86700a79fea7\" returns successfully"
Jan 13 21:07:52.860700 containerd[1650]: time="2025-01-13T21:07:52.860683674Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\""
Jan 13 21:07:52.860888 containerd[1650]: time="2025-01-13T21:07:52.860829382Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully"
Jan 13 21:07:52.860888 containerd[1650]: time="2025-01-13T21:07:52.860840398Z" level=info msg="StopPodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully"
Jan 13 21:07:52.861073 containerd[1650]: time="2025-01-13T21:07:52.860990745Z" level=info msg="RemovePodSandbox for \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\""
Jan 13 21:07:52.861073 containerd[1650]: time="2025-01-13T21:07:52.861008541Z" level=info msg="Forcibly stopping sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\""
Jan 13 21:07:52.861073 containerd[1650]: time="2025-01-13T21:07:52.861051691Z" level=info msg="TearDown network for sandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" successfully"
Jan 13 21:07:52.885560 containerd[1650]: time="2025-01-13T21:07:52.885497044Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.885686 containerd[1650]: time="2025-01-13T21:07:52.885586516Z" level=info msg="RemovePodSandbox \"aea9192ee1374ec2206af81006427b28f9e1c2925d5f34919460e0268e4b1c0f\" returns successfully"
Jan 13 21:07:52.886283 containerd[1650]: time="2025-01-13T21:07:52.886017224Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\""
Jan 13 21:07:52.886283 containerd[1650]: time="2025-01-13T21:07:52.886095010Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully"
Jan 13 21:07:52.886283 containerd[1650]: time="2025-01-13T21:07:52.886107470Z" level=info msg="StopPodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully"
Jan 13 21:07:52.886400 containerd[1650]: time="2025-01-13T21:07:52.886343782Z" level=info msg="RemovePodSandbox for \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\""
Jan 13 21:07:52.886400 containerd[1650]: time="2025-01-13T21:07:52.886362549Z" level=info msg="Forcibly stopping sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\""
Jan 13 21:07:52.886550 containerd[1650]: time="2025-01-13T21:07:52.886461591Z" level=info msg="TearDown network for sandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" successfully"
Jan 13 21:07:52.906868 containerd[1650]: time="2025-01-13T21:07:52.906786357Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.906956 containerd[1650]: time="2025-01-13T21:07:52.906888115Z" level=info msg="RemovePodSandbox \"50d57287063a4e1f81a8d30669666a65737fcca003a36010825b08339ceb092b\" returns successfully"
Jan 13 21:07:52.907464 containerd[1650]: time="2025-01-13T21:07:52.907223890Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\""
Jan 13 21:07:52.907464 containerd[1650]: time="2025-01-13T21:07:52.907310264Z" level=info msg="TearDown network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully"
Jan 13 21:07:52.907464 containerd[1650]: time="2025-01-13T21:07:52.907318389Z" level=info msg="StopPodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully"
Jan 13 21:07:52.908864 containerd[1650]: time="2025-01-13T21:07:52.908238518Z" level=info msg="RemovePodSandbox for \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\""
Jan 13 21:07:52.908864 containerd[1650]: time="2025-01-13T21:07:52.908256411Z" level=info msg="Forcibly stopping sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\""
Jan 13 21:07:52.908864 containerd[1650]: time="2025-01-13T21:07:52.908297902Z" level=info msg="TearDown network for sandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" successfully"
Jan 13 21:07:52.923134 containerd[1650]: time="2025-01-13T21:07:52.923105391Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.923244 containerd[1650]: time="2025-01-13T21:07:52.923179082Z" level=info msg="RemovePodSandbox \"dbcb580c491405d2a42bfd15434c3baab6ea2a37d87006ec93d98fd622af73d6\" returns successfully"
Jan 13 21:07:52.923572 containerd[1650]: time="2025-01-13T21:07:52.923554250Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\""
Jan 13 21:07:52.923633 containerd[1650]: time="2025-01-13T21:07:52.923618164Z" level=info msg="TearDown network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" successfully"
Jan 13 21:07:52.923633 containerd[1650]: time="2025-01-13T21:07:52.923628179Z" level=info msg="StopPodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" returns successfully"
Jan 13 21:07:52.923821 containerd[1650]: time="2025-01-13T21:07:52.923793887Z" level=info msg="RemovePodSandbox for \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\""
Jan 13 21:07:52.923879 containerd[1650]: time="2025-01-13T21:07:52.923854047Z" level=info msg="Forcibly stopping sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\""
Jan 13 21:07:52.923963 containerd[1650]: time="2025-01-13T21:07:52.923932676Z" level=info msg="TearDown network for sandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" successfully"
Jan 13 21:07:52.954431 containerd[1650]: time="2025-01-13T21:07:52.954395499Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.954530 containerd[1650]: time="2025-01-13T21:07:52.954444177Z" level=info msg="RemovePodSandbox \"5a37f595e88b51b33646d256ea5af5965dd5322eff3672504a7cdea5db321771\" returns successfully"
Jan 13 21:07:52.954770 containerd[1650]: time="2025-01-13T21:07:52.954753112Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\""
Jan 13 21:07:52.954842 containerd[1650]: time="2025-01-13T21:07:52.954827189Z" level=info msg="TearDown network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" successfully"
Jan 13 21:07:52.954842 containerd[1650]: time="2025-01-13T21:07:52.954838392Z" level=info msg="StopPodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" returns successfully"
Jan 13 21:07:52.954985 containerd[1650]: time="2025-01-13T21:07:52.954970081Z" level=info msg="RemovePodSandbox for \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\""
Jan 13 21:07:52.959378 containerd[1650]: time="2025-01-13T21:07:52.954984184Z" level=info msg="Forcibly stopping sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\""
Jan 13 21:07:52.959378 containerd[1650]: time="2025-01-13T21:07:52.959012635Z" level=info msg="TearDown network for sandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" successfully"
Jan 13 21:07:52.995628 containerd[1650]: time="2025-01-13T21:07:52.995578234Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:52.995628 containerd[1650]: time="2025-01-13T21:07:52.995627940Z" level=info msg="RemovePodSandbox \"f272d1af3f2994426d37c55ea5ea97890fe7e7a8544b90817279de9e5b7423b1\" returns successfully"
Jan 13 21:07:52.995952 containerd[1650]: time="2025-01-13T21:07:52.995934037Z" level=info msg="StopPodSandbox for \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\""
Jan 13 21:07:52.996022 containerd[1650]: time="2025-01-13T21:07:52.995999836Z" level=info msg="TearDown network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" successfully"
Jan 13 21:07:52.996022 containerd[1650]: time="2025-01-13T21:07:52.996010590Z" level=info msg="StopPodSandbox for \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" returns successfully"
Jan 13 21:07:52.996201 containerd[1650]: time="2025-01-13T21:07:52.996181737Z" level=info msg="RemovePodSandbox for \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\""
Jan 13 21:07:52.996261 containerd[1650]: time="2025-01-13T21:07:52.996247021Z" level=info msg="Forcibly stopping sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\""
Jan 13 21:07:52.996334 containerd[1650]: time="2025-01-13T21:07:52.996294442Z" level=info msg="TearDown network for sandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" successfully"
Jan 13 21:07:53.065176 containerd[1650]: time="2025-01-13T21:07:53.065142388Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.065176 containerd[1650]: time="2025-01-13T21:07:53.065192243Z" level=info msg="RemovePodSandbox \"f74a82f09b6c760df088a4d50093d242a6a23f45b959053e214db53636dc94b6\" returns successfully"
Jan 13 21:07:53.065530 containerd[1650]: time="2025-01-13T21:07:53.065490704Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\""
Jan 13 21:07:53.065561 containerd[1650]: time="2025-01-13T21:07:53.065553310Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully"
Jan 13 21:07:53.065588 containerd[1650]: time="2025-01-13T21:07:53.065561519Z" level=info msg="StopPodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully"
Jan 13 21:07:53.066404 containerd[1650]: time="2025-01-13T21:07:53.065774174Z" level=info msg="RemovePodSandbox for \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\""
Jan 13 21:07:53.066404 containerd[1650]: time="2025-01-13T21:07:53.065793955Z" level=info msg="Forcibly stopping sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\""
Jan 13 21:07:53.066404 containerd[1650]: time="2025-01-13T21:07:53.065860751Z" level=info msg="TearDown network for sandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" successfully"
Jan 13 21:07:53.131597 containerd[1650]: time="2025-01-13T21:07:53.131437922Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.131597 containerd[1650]: time="2025-01-13T21:07:53.131497167Z" level=info msg="RemovePodSandbox \"c23bf655e1f7572c7e02e26cb33bba0bec8ea9caafffbd291247c0da6c4e94a3\" returns successfully"
Jan 13 21:07:53.132150 containerd[1650]: time="2025-01-13T21:07:53.131962696Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\""
Jan 13 21:07:53.132150 containerd[1650]: time="2025-01-13T21:07:53.132033335Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully"
Jan 13 21:07:53.132150 containerd[1650]: time="2025-01-13T21:07:53.132042324Z" level=info msg="StopPodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully"
Jan 13 21:07:53.132542 containerd[1650]: time="2025-01-13T21:07:53.132282255Z" level=info msg="RemovePodSandbox for \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\""
Jan 13 21:07:53.132542 containerd[1650]: time="2025-01-13T21:07:53.132473760Z" level=info msg="Forcibly stopping sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\""
Jan 13 21:07:53.132776 containerd[1650]: time="2025-01-13T21:07:53.132521515Z" level=info msg="TearDown network for sandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" successfully"
Jan 13 21:07:53.190466 containerd[1650]: time="2025-01-13T21:07:53.190384823Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.191029 containerd[1650]: time="2025-01-13T21:07:53.190944221Z" level=info msg="RemovePodSandbox \"8f30daf62b1483585b80fca9dda0265c3091a148e8b4a1912746346c82ab2c05\" returns successfully"
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191256296Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\""
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191329470Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully"
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191340454Z" level=info msg="StopPodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully"
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191666271Z" level=info msg="RemovePodSandbox for \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\""
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191685056Z" level=info msg="Forcibly stopping sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\""
Jan 13 21:07:53.191904 containerd[1650]: time="2025-01-13T21:07:53.191734581Z" level=info msg="TearDown network for sandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" successfully"
Jan 13 21:07:53.234735 containerd[1650]: time="2025-01-13T21:07:53.234517517Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.234735 containerd[1650]: time="2025-01-13T21:07:53.234592228Z" level=info msg="RemovePodSandbox \"12a09d46897f0fc8893b3c04c51546cc32a917f2f22d1f2486baa37ac6aa017a\" returns successfully"
Jan 13 21:07:53.235446 containerd[1650]: time="2025-01-13T21:07:53.235421151Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\""
Jan 13 21:07:53.235518 containerd[1650]: time="2025-01-13T21:07:53.235502453Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully"
Jan 13 21:07:53.235518 containerd[1650]: time="2025-01-13T21:07:53.235512145Z" level=info msg="StopPodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully"
Jan 13 21:07:53.236495 containerd[1650]: time="2025-01-13T21:07:53.235703554Z" level=info msg="RemovePodSandbox for \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\""
Jan 13 21:07:53.236495 containerd[1650]: time="2025-01-13T21:07:53.236059616Z" level=info msg="Forcibly stopping sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\""
Jan 13 21:07:53.236695 containerd[1650]: time="2025-01-13T21:07:53.236127269Z" level=info msg="TearDown network for sandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" successfully"
Jan 13 21:07:53.264003 containerd[1650]: time="2025-01-13T21:07:53.263950336Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.264003 containerd[1650]: time="2025-01-13T21:07:53.263999131Z" level=info msg="RemovePodSandbox \"f61ca63902cfe4449ecea26add9af5e5fea1e9ec2cb015c5d741adfd2f5f2fb8\" returns successfully"
Jan 13 21:07:53.264365 containerd[1650]: time="2025-01-13T21:07:53.264353076Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\""
Jan 13 21:07:53.264505 containerd[1650]: time="2025-01-13T21:07:53.264463327Z" level=info msg="TearDown network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" successfully"
Jan 13 21:07:53.264505 containerd[1650]: time="2025-01-13T21:07:53.264473948Z" level=info msg="StopPodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" returns successfully"
Jan 13 21:07:53.264880 containerd[1650]: time="2025-01-13T21:07:53.264755625Z" level=info msg="RemovePodSandbox for \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\""
Jan 13 21:07:53.264880 containerd[1650]: time="2025-01-13T21:07:53.264768228Z" level=info msg="Forcibly stopping sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\""
Jan 13 21:07:53.264880 containerd[1650]: time="2025-01-13T21:07:53.264830228Z" level=info msg="TearDown network for sandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" successfully"
Jan 13 21:07:53.284197 containerd[1650]: time="2025-01-13T21:07:53.284107571Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.284698 containerd[1650]: time="2025-01-13T21:07:53.284500182Z" level=info msg="RemovePodSandbox \"7fdb06561bb0ba50dd6289c63eb918c9336d83a8954c53c6b065948413692857\" returns successfully"
Jan 13 21:07:53.285350 containerd[1650]: time="2025-01-13T21:07:53.285170638Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\""
Jan 13 21:07:53.285676 containerd[1650]: time="2025-01-13T21:07:53.285389874Z" level=info msg="TearDown network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" successfully"
Jan 13 21:07:53.285676 containerd[1650]: time="2025-01-13T21:07:53.285670969Z" level=info msg="StopPodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" returns successfully"
Jan 13 21:07:53.286784 containerd[1650]: time="2025-01-13T21:07:53.285962600Z" level=info msg="RemovePodSandbox for \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\""
Jan 13 21:07:53.286784 containerd[1650]: time="2025-01-13T21:07:53.285979907Z" level=info msg="Forcibly stopping sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\""
Jan 13 21:07:53.286784 containerd[1650]: time="2025-01-13T21:07:53.286032148Z" level=info msg="TearDown network for sandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" successfully"
Jan 13 21:07:53.298637 containerd[1650]: time="2025-01-13T21:07:53.298606825Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.298813 containerd[1650]: time="2025-01-13T21:07:53.298653338Z" level=info msg="RemovePodSandbox \"1f99610232ed5819067743e92768cd2d0287037bb13f7fb31d180609f8bd04be\" returns successfully"
Jan 13 21:07:53.299096 containerd[1650]: time="2025-01-13T21:07:53.299085627Z" level=info msg="StopPodSandbox for \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\""
Jan 13 21:07:53.299233 containerd[1650]: time="2025-01-13T21:07:53.299187561Z" level=info msg="TearDown network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" successfully"
Jan 13 21:07:53.299233 containerd[1650]: time="2025-01-13T21:07:53.299197122Z" level=info msg="StopPodSandbox for \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" returns successfully"
Jan 13 21:07:53.304094 containerd[1650]: time="2025-01-13T21:07:53.299397640Z" level=info msg="RemovePodSandbox for \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\""
Jan 13 21:07:53.304094 containerd[1650]: time="2025-01-13T21:07:53.299517104Z" level=info msg="Forcibly stopping sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\""
Jan 13 21:07:53.304094 containerd[1650]: time="2025-01-13T21:07:53.299557205Z" level=info msg="TearDown network for sandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" successfully"
Jan 13 21:07:53.311562 containerd[1650]: time="2025-01-13T21:07:53.311546703Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.311704 containerd[1650]: time="2025-01-13T21:07:53.311641114Z" level=info msg="RemovePodSandbox \"02235effe1a1072f6bcc9de9b0675a4e3fa164e5e8688b824e1dc6eacc4c15df\" returns successfully"
Jan 13 21:07:53.311984 containerd[1650]: time="2025-01-13T21:07:53.311968747Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\""
Jan 13 21:07:53.312025 containerd[1650]: time="2025-01-13T21:07:53.312018185Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully"
Jan 13 21:07:53.312075 containerd[1650]: time="2025-01-13T21:07:53.312024983Z" level=info msg="StopPodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully"
Jan 13 21:07:53.312829 containerd[1650]: time="2025-01-13T21:07:53.312172308Z" level=info msg="RemovePodSandbox for \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\""
Jan 13 21:07:53.312829 containerd[1650]: time="2025-01-13T21:07:53.312191727Z" level=info msg="Forcibly stopping sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\""
Jan 13 21:07:53.312829 containerd[1650]: time="2025-01-13T21:07:53.312233760Z" level=info msg="TearDown network for sandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" successfully"
Jan 13 21:07:53.326017 containerd[1650]: time="2025-01-13T21:07:53.325995540Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.326082 containerd[1650]: time="2025-01-13T21:07:53.326035652Z" level=info msg="RemovePodSandbox \"3c6b5a0a52b8daeb4a5555032efdeb327e2f7395d966cc2be06fc50ad2cff7f6\" returns successfully"
Jan 13 21:07:53.326375 containerd[1650]: time="2025-01-13T21:07:53.326286468Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\""
Jan 13 21:07:53.326375 containerd[1650]: time="2025-01-13T21:07:53.326334650Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully"
Jan 13 21:07:53.326375 containerd[1650]: time="2025-01-13T21:07:53.326342202Z" level=info msg="StopPodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully"
Jan 13 21:07:53.327202 containerd[1650]: time="2025-01-13T21:07:53.326498768Z" level=info msg="RemovePodSandbox for \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\""
Jan 13 21:07:53.327202 containerd[1650]: time="2025-01-13T21:07:53.326512237Z" level=info msg="Forcibly stopping sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\""
Jan 13 21:07:53.327202 containerd[1650]: time="2025-01-13T21:07:53.326569222Z" level=info msg="TearDown network for sandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" successfully"
Jan 13 21:07:53.347235 containerd[1650]: time="2025-01-13T21:07:53.347202805Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.347327 containerd[1650]: time="2025-01-13T21:07:53.347251363Z" level=info msg="RemovePodSandbox \"57b46dd2bfb21f499582244fe64f8e81b94c120b4fb953999b5830ef095e99e9\" returns successfully"
Jan 13 21:07:53.347566 containerd[1650]: time="2025-01-13T21:07:53.347552274Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\""
Jan 13 21:07:53.351699 containerd[1650]: time="2025-01-13T21:07:53.347609326Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully"
Jan 13 21:07:53.351699 containerd[1650]: time="2025-01-13T21:07:53.347615593Z" level=info msg="StopPodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully"
Jan 13 21:07:53.351699 containerd[1650]: time="2025-01-13T21:07:53.347753182Z" level=info msg="RemovePodSandbox for \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\""
Jan 13 21:07:53.351699 containerd[1650]: time="2025-01-13T21:07:53.347767184Z" level=info msg="Forcibly stopping sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\""
Jan 13 21:07:53.351699 containerd[1650]: time="2025-01-13T21:07:53.347913032Z" level=info msg="TearDown network for sandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" successfully"
Jan 13 21:07:53.369621 containerd[1650]: time="2025-01-13T21:07:53.369510151Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus."
Jan 13 21:07:53.369621 containerd[1650]: time="2025-01-13T21:07:53.369555710Z" level=info msg="RemovePodSandbox \"1913b37e033bf21bdcb4266b150fa71ca4a30dc4dc5b4a8b4db7f0c8ebc45998\" returns successfully" Jan 13 21:07:53.369891 containerd[1650]: time="2025-01-13T21:07:53.369865439Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:53.369950 containerd[1650]: time="2025-01-13T21:07:53.369933893Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:53.369950 containerd[1650]: time="2025-01-13T21:07:53.369947314Z" level=info msg="StopPodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:53.370918 containerd[1650]: time="2025-01-13T21:07:53.370088865Z" level=info msg="RemovePodSandbox for \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:53.370918 containerd[1650]: time="2025-01-13T21:07:53.370099744Z" level=info msg="Forcibly stopping sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\"" Jan 13 21:07:53.370918 containerd[1650]: time="2025-01-13T21:07:53.370131395Z" level=info msg="TearDown network for sandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" successfully" Jan 13 21:07:53.376981 containerd[1650]: time="2025-01-13T21:07:53.376035883Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.376981 containerd[1650]: time="2025-01-13T21:07:53.376110179Z" level=info msg="RemovePodSandbox \"2b1ee6b3efb3f071453609d36bbd23487921923460640c8810f5c837edfe6f4c\" returns successfully" Jan 13 21:07:53.377779 containerd[1650]: time="2025-01-13T21:07:53.377612498Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:53.377779 containerd[1650]: time="2025-01-13T21:07:53.377711724Z" level=info msg="TearDown network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" successfully" Jan 13 21:07:53.377779 containerd[1650]: time="2025-01-13T21:07:53.377720771Z" level=info msg="StopPodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" returns successfully" Jan 13 21:07:53.377971 containerd[1650]: time="2025-01-13T21:07:53.377926360Z" level=info msg="RemovePodSandbox for \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:53.378001 containerd[1650]: time="2025-01-13T21:07:53.377971667Z" level=info msg="Forcibly stopping sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\"" Jan 13 21:07:53.378058 containerd[1650]: time="2025-01-13T21:07:53.378026265Z" level=info msg="TearDown network for sandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" successfully" Jan 13 21:07:53.380459 containerd[1650]: time="2025-01-13T21:07:53.380439224Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.380892 containerd[1650]: time="2025-01-13T21:07:53.380592174Z" level=info msg="RemovePodSandbox \"1545401b6deaf810df436ac7be358c408265275b422971dddafe84fdfd32f481\" returns successfully" Jan 13 21:07:53.381045 containerd[1650]: time="2025-01-13T21:07:53.381016039Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.381188063Z" level=info msg="TearDown network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" successfully" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.381196112Z" level=info msg="StopPodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" returns successfully" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.381377956Z" level=info msg="RemovePodSandbox for \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.381390188Z" level=info msg="Forcibly stopping sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\"" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.381431453Z" level=info msg="TearDown network for sandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" successfully" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.383488636Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.383536089Z" level=info msg="RemovePodSandbox \"f6f65083c55be6afddae682f3016d1997deb4103e554943a00b83548f4f85e25\" returns successfully" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.384035083Z" level=info msg="StopPodSandbox for \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\"" Jan 13 21:07:53.384332 containerd[1650]: time="2025-01-13T21:07:53.384180914Z" level=info msg="TearDown network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" successfully" Jan 13 21:07:53.384790 containerd[1650]: time="2025-01-13T21:07:53.384189368Z" level=info msg="StopPodSandbox for \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" returns successfully" Jan 13 21:07:53.385439 containerd[1650]: time="2025-01-13T21:07:53.384909267Z" level=info msg="RemovePodSandbox for \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\"" Jan 13 21:07:53.385439 containerd[1650]: time="2025-01-13T21:07:53.384933055Z" level=info msg="Forcibly stopping sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\"" Jan 13 21:07:53.385439 containerd[1650]: time="2025-01-13T21:07:53.384979344Z" level=info msg="TearDown network for sandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" successfully" Jan 13 21:07:53.387150 containerd[1650]: time="2025-01-13T21:07:53.387127838Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.387216 containerd[1650]: time="2025-01-13T21:07:53.387172226Z" level=info msg="RemovePodSandbox \"1ecf384bf7534aba6bd7303364656c8aa6658b2d9b9f7be1cc9f674338c58143\" returns successfully" Jan 13 21:07:53.387480 containerd[1650]: time="2025-01-13T21:07:53.387468675Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:53.387626 containerd[1650]: time="2025-01-13T21:07:53.387569606Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:53.387626 containerd[1650]: time="2025-01-13T21:07:53.387580286Z" level=info msg="StopPodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:53.388042 containerd[1650]: time="2025-01-13T21:07:53.387996585Z" level=info msg="RemovePodSandbox for \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:53.388042 containerd[1650]: time="2025-01-13T21:07:53.388010408Z" level=info msg="Forcibly stopping sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\"" Jan 13 21:07:53.388122 containerd[1650]: time="2025-01-13T21:07:53.388048528Z" level=info msg="TearDown network for sandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" successfully" Jan 13 21:07:53.389958 containerd[1650]: time="2025-01-13T21:07:53.389935493Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.389991 containerd[1650]: time="2025-01-13T21:07:53.389968924Z" level=info msg="RemovePodSandbox \"b70200679f5725389046947adbc3bfbb116b65737dd8ca7c39e27d75f8c84d35\" returns successfully" Jan 13 21:07:53.390356 containerd[1650]: time="2025-01-13T21:07:53.390196029Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:53.390356 containerd[1650]: time="2025-01-13T21:07:53.390245392Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:53.390356 containerd[1650]: time="2025-01-13T21:07:53.390256052Z" level=info msg="StopPodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:53.390553 containerd[1650]: time="2025-01-13T21:07:53.390509186Z" level=info msg="RemovePodSandbox for \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:53.390723 containerd[1650]: time="2025-01-13T21:07:53.390614538Z" level=info msg="Forcibly stopping sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\"" Jan 13 21:07:53.390723 containerd[1650]: time="2025-01-13T21:07:53.390670658Z" level=info msg="TearDown network for sandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" successfully" Jan 13 21:07:53.394575 containerd[1650]: time="2025-01-13T21:07:53.394545688Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.394886 containerd[1650]: time="2025-01-13T21:07:53.394701022Z" level=info msg="RemovePodSandbox \"59c7ebdf79f722ea8f120e5abfccd7ef6f8326be38c75c133039e8de74de330b\" returns successfully" Jan 13 21:07:53.395552 containerd[1650]: time="2025-01-13T21:07:53.395534132Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:53.395614 containerd[1650]: time="2025-01-13T21:07:53.395599560Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:53.395676 containerd[1650]: time="2025-01-13T21:07:53.395612693Z" level=info msg="StopPodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:53.396812 containerd[1650]: time="2025-01-13T21:07:53.395849286Z" level=info msg="RemovePodSandbox for \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:53.396812 containerd[1650]: time="2025-01-13T21:07:53.395864668Z" level=info msg="Forcibly stopping sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\"" Jan 13 21:07:53.396812 containerd[1650]: time="2025-01-13T21:07:53.395901787Z" level=info msg="TearDown network for sandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" successfully" Jan 13 21:07:53.397863 containerd[1650]: time="2025-01-13T21:07:53.397849834Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.397927 containerd[1650]: time="2025-01-13T21:07:53.397917619Z" level=info msg="RemovePodSandbox \"0e1ac1f551d25a587d597e9273b23091eefc2a6351d76e9a9d30ec2656071d69\" returns successfully" Jan 13 21:07:53.398175 containerd[1650]: time="2025-01-13T21:07:53.398158383Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:53.398227 containerd[1650]: time="2025-01-13T21:07:53.398212998Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:53.398227 containerd[1650]: time="2025-01-13T21:07:53.398223328Z" level=info msg="StopPodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:53.398439 containerd[1650]: time="2025-01-13T21:07:53.398424423Z" level=info msg="RemovePodSandbox for \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:53.398506 containerd[1650]: time="2025-01-13T21:07:53.398463157Z" level=info msg="Forcibly stopping sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\"" Jan 13 21:07:53.398570 containerd[1650]: time="2025-01-13T21:07:53.398528545Z" level=info msg="TearDown network for sandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" successfully" Jan 13 21:07:53.400322 containerd[1650]: time="2025-01-13T21:07:53.400302411Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.400360 containerd[1650]: time="2025-01-13T21:07:53.400343506Z" level=info msg="RemovePodSandbox \"5ffd3d58828171f8ce841ad5970beb1b76fc3d88fcc86e57308b2fa5f7cd8ed7\" returns successfully" Jan 13 21:07:53.400530 containerd[1650]: time="2025-01-13T21:07:53.400515043Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:53.400580 containerd[1650]: time="2025-01-13T21:07:53.400566439Z" level=info msg="TearDown network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" successfully" Jan 13 21:07:53.400580 containerd[1650]: time="2025-01-13T21:07:53.400575791Z" level=info msg="StopPodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" returns successfully" Jan 13 21:07:53.400761 containerd[1650]: time="2025-01-13T21:07:53.400740363Z" level=info msg="RemovePodSandbox for \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:53.400761 containerd[1650]: time="2025-01-13T21:07:53.400755133Z" level=info msg="Forcibly stopping sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\"" Jan 13 21:07:53.400865 containerd[1650]: time="2025-01-13T21:07:53.400800020Z" level=info msg="TearDown network for sandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" successfully" Jan 13 21:07:53.402317 containerd[1650]: time="2025-01-13T21:07:53.402299201Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.402351 containerd[1650]: time="2025-01-13T21:07:53.402330610Z" level=info msg="RemovePodSandbox \"6fd018196622426e7868c6f6086ca1ebf2deb72249300f26803d6c2da6e76a82\" returns successfully" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402546857Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402594929Z" level=info msg="TearDown network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" successfully" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402602478Z" level=info msg="StopPodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" returns successfully" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402746347Z" level=info msg="RemovePodSandbox for \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402759209Z" level=info msg="Forcibly stopping sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\"" Jan 13 21:07:53.403263 containerd[1650]: time="2025-01-13T21:07:53.402793770Z" level=info msg="TearDown network for sandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" successfully" Jan 13 21:07:53.404445 containerd[1650]: time="2025-01-13T21:07:53.404432799Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.404504 containerd[1650]: time="2025-01-13T21:07:53.404495195Z" level=info msg="RemovePodSandbox \"a482756cac331f2725ce60bed10dac369b603d75c4c58c60e08ea272cfd47733\" returns successfully" Jan 13 21:07:53.404739 containerd[1650]: time="2025-01-13T21:07:53.404720071Z" level=info msg="StopPodSandbox for \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\"" Jan 13 21:07:53.404788 containerd[1650]: time="2025-01-13T21:07:53.404776152Z" level=info msg="TearDown network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" successfully" Jan 13 21:07:53.404821 containerd[1650]: time="2025-01-13T21:07:53.404786664Z" level=info msg="StopPodSandbox for \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" returns successfully" Jan 13 21:07:53.405241 containerd[1650]: time="2025-01-13T21:07:53.404983474Z" level=info msg="RemovePodSandbox for \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\"" Jan 13 21:07:53.405241 containerd[1650]: time="2025-01-13T21:07:53.404996759Z" level=info msg="Forcibly stopping sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\"" Jan 13 21:07:53.405241 containerd[1650]: time="2025-01-13T21:07:53.405037321Z" level=info msg="TearDown network for sandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" successfully" Jan 13 21:07:53.406529 containerd[1650]: time="2025-01-13T21:07:53.406516558Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.406613478Z" level=info msg="RemovePodSandbox \"019b3337f6013d54afc9ce0ee0fe680b0e99a71e8e230def8e0a836c6a4b91cb\" returns successfully" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.406798184Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.406891783Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.406902882Z" level=info msg="StopPodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.407044868Z" level=info msg="RemovePodSandbox for \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.407056442Z" level=info msg="Forcibly stopping sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\"" Jan 13 21:07:53.407629 containerd[1650]: time="2025-01-13T21:07:53.407097946Z" level=info msg="TearDown network for sandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" successfully" Jan 13 21:07:53.408946 containerd[1650]: time="2025-01-13T21:07:53.408780121Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.408946 containerd[1650]: time="2025-01-13T21:07:53.408821171Z" level=info msg="RemovePodSandbox \"a509f0e4a628a41ddcf317307de328dbdad39ab4a330e2ea773fc334d23681eb\" returns successfully" Jan 13 21:07:53.409020 containerd[1650]: time="2025-01-13T21:07:53.409002234Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:53.409065 containerd[1650]: time="2025-01-13T21:07:53.409048010Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:53.409065 containerd[1650]: time="2025-01-13T21:07:53.409061326Z" level=info msg="StopPodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:53.409916 containerd[1650]: time="2025-01-13T21:07:53.409904566Z" level=info msg="RemovePodSandbox for \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:53.410068 containerd[1650]: time="2025-01-13T21:07:53.409960528Z" level=info msg="Forcibly stopping sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\"" Jan 13 21:07:53.410068 containerd[1650]: time="2025-01-13T21:07:53.410000413Z" level=info msg="TearDown network for sandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" successfully" Jan 13 21:07:53.411655 containerd[1650]: time="2025-01-13T21:07:53.411580752Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.411655 containerd[1650]: time="2025-01-13T21:07:53.411616367Z" level=info msg="RemovePodSandbox \"4c68bcc3ef449b0338a76bd2755d2af06da1928fe1e7bff0cda8f734a678f13d\" returns successfully" Jan 13 21:07:53.411933 containerd[1650]: time="2025-01-13T21:07:53.411819156Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:53.411933 containerd[1650]: time="2025-01-13T21:07:53.411894586Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:53.412012 containerd[1650]: time="2025-01-13T21:07:53.412000033Z" level=info msg="StopPodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:53.412348 containerd[1650]: time="2025-01-13T21:07:53.412261961Z" level=info msg="RemovePodSandbox for \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:53.412348 containerd[1650]: time="2025-01-13T21:07:53.412312474Z" level=info msg="Forcibly stopping sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\"" Jan 13 21:07:53.412491 containerd[1650]: time="2025-01-13T21:07:53.412420920Z" level=info msg="TearDown network for sandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" successfully" Jan 13 21:07:53.413935 containerd[1650]: time="2025-01-13T21:07:53.413860806Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.413935 containerd[1650]: time="2025-01-13T21:07:53.413892236Z" level=info msg="RemovePodSandbox \"e445ff62ba8986ad45dd33587de0da277438c1512bb3344f67c3e148875f76c4\" returns successfully" Jan 13 21:07:53.414096 containerd[1650]: time="2025-01-13T21:07:53.414077231Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:53.414244 containerd[1650]: time="2025-01-13T21:07:53.414225368Z" level=info msg="TearDown network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:53.414280 containerd[1650]: time="2025-01-13T21:07:53.414242560Z" level=info msg="StopPodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:53.414858 containerd[1650]: time="2025-01-13T21:07:53.414404460Z" level=info msg="RemovePodSandbox for \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:53.414858 containerd[1650]: time="2025-01-13T21:07:53.414420160Z" level=info msg="Forcibly stopping sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\"" Jan 13 21:07:53.414858 containerd[1650]: time="2025-01-13T21:07:53.414456020Z" level=info msg="TearDown network for sandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" successfully" Jan 13 21:07:53.415865 containerd[1650]: time="2025-01-13T21:07:53.415847597Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.415914 containerd[1650]: time="2025-01-13T21:07:53.415879102Z" level=info msg="RemovePodSandbox \"f7c139f2931bd270bd6250eb287ba4467cdb708cec2754497b7a3ae870fe8ff5\" returns successfully" Jan 13 21:07:53.416236 containerd[1650]: time="2025-01-13T21:07:53.416108461Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:53.416236 containerd[1650]: time="2025-01-13T21:07:53.416182977Z" level=info msg="TearDown network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" successfully" Jan 13 21:07:53.416236 containerd[1650]: time="2025-01-13T21:07:53.416195087Z" level=info msg="StopPodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" returns successfully" Jan 13 21:07:53.416451 containerd[1650]: time="2025-01-13T21:07:53.416384042Z" level=info msg="RemovePodSandbox for \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:53.416580 containerd[1650]: time="2025-01-13T21:07:53.416419544Z" level=info msg="Forcibly stopping sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\"" Jan 13 21:07:53.416580 containerd[1650]: time="2025-01-13T21:07:53.416543187Z" level=info msg="TearDown network for sandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" successfully" Jan 13 21:07:53.418409 containerd[1650]: time="2025-01-13T21:07:53.418337870Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.418409 containerd[1650]: time="2025-01-13T21:07:53.418371101Z" level=info msg="RemovePodSandbox \"099087a5bf61a6ad326352ce0be22559982cdf5d097c369213acba478f1441e1\" returns successfully" Jan 13 21:07:53.418647 containerd[1650]: time="2025-01-13T21:07:53.418630620Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" Jan 13 21:07:53.418791 containerd[1650]: time="2025-01-13T21:07:53.418682140Z" level=info msg="TearDown network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" successfully" Jan 13 21:07:53.418791 containerd[1650]: time="2025-01-13T21:07:53.418689233Z" level=info msg="StopPodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" returns successfully" Jan 13 21:07:53.419382 containerd[1650]: time="2025-01-13T21:07:53.418907381Z" level=info msg="RemovePodSandbox for \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" Jan 13 21:07:53.419382 containerd[1650]: time="2025-01-13T21:07:53.418920684Z" level=info msg="Forcibly stopping sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\"" Jan 13 21:07:53.419382 containerd[1650]: time="2025-01-13T21:07:53.418954741Z" level=info msg="TearDown network for sandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" successfully" Jan 13 21:07:53.420312 containerd[1650]: time="2025-01-13T21:07:53.420295511Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.420963 containerd[1650]: time="2025-01-13T21:07:53.420325226Z" level=info msg="RemovePodSandbox \"2baa5a600c0f2113e46e0aed7bb7138b30194c3e74d90df8ed25e053d18ee6ee\" returns successfully" Jan 13 21:07:53.420963 containerd[1650]: time="2025-01-13T21:07:53.420748268Z" level=info msg="StopPodSandbox for \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\"" Jan 13 21:07:53.420963 containerd[1650]: time="2025-01-13T21:07:53.420819213Z" level=info msg="TearDown network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" successfully" Jan 13 21:07:53.420963 containerd[1650]: time="2025-01-13T21:07:53.420828336Z" level=info msg="StopPodSandbox for \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" returns successfully" Jan 13 21:07:53.421844 containerd[1650]: time="2025-01-13T21:07:53.421135484Z" level=info msg="RemovePodSandbox for \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\"" Jan 13 21:07:53.421844 containerd[1650]: time="2025-01-13T21:07:53.421151092Z" level=info msg="Forcibly stopping sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\"" Jan 13 21:07:53.421844 containerd[1650]: time="2025-01-13T21:07:53.421193627Z" level=info msg="TearDown network for sandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" successfully" Jan 13 21:07:53.422791 containerd[1650]: time="2025-01-13T21:07:53.422772191Z" level=warning msg="Failed to get podSandbox status for container event for sandboxID \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\": an error occurred when try to find sandbox: not found. Sending the event with nil podSandboxStatus." 
Jan 13 21:07:53.422851 containerd[1650]: time="2025-01-13T21:07:53.422831823Z" level=info msg="RemovePodSandbox \"8500afd184705a58c8f39db00837d33b040955ebd6ee8a4c8f2e6c3232acb148\" returns successfully" Jan 13 21:07:53.987010 systemd-resolved[1547]: Under memory pressure, flushing caches. Jan 13 21:07:53.987016 systemd-resolved[1547]: Flushed all caches. Jan 13 21:07:53.988827 systemd-journald[1197]: Under memory pressure, flushing caches. Jan 13 21:08:00.284731 kubelet[3044]: I0113 21:08:00.284709 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 13 21:08:00.370006 kubelet[3044]: I0113 21:08:00.369430 3044 pod_startup_latency_tracker.go:102] "Observed pod startup duration" pod="calico-system/csi-node-driver-x4zz7" podStartSLOduration=39.298276349 podStartE2EDuration="48.369397281s" podCreationTimestamp="2025-01-13 21:07:12 +0000 UTC" firstStartedPulling="2025-01-13 21:07:31.806656056 +0000 UTC m=+39.502204542" lastFinishedPulling="2025-01-13 21:07:40.877776987 +0000 UTC m=+48.573325474" observedRunningTime="2025-01-13 21:07:41.155534989 +0000 UTC m=+48.851083487" watchObservedRunningTime="2025-01-13 21:08:00.369397281 +0000 UTC m=+68.064945773" Jan 13 21:08:09.987903 systemd-journald[1197]: Under memory pressure, flushing caches. Jan 13 21:08:09.987324 systemd-resolved[1547]: Under memory pressure, flushing caches. Jan 13 21:08:09.987331 systemd-resolved[1547]: Flushed all caches. Jan 13 21:08:10.933277 systemd[1]: Started sshd@7-139.178.70.107:22-147.75.109.163:45226.service - OpenSSH per-connection server daemon (147.75.109.163:45226). Jan 13 21:08:11.090852 sshd[6142]: Accepted publickey for core from 147.75.109.163 port 45226 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw Jan 13 21:08:11.096163 sshd-session[6142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 13 21:08:11.105679 systemd-logind[1629]: New session 10 of user core. 
Jan 13 21:08:11.110996 systemd[1]: Started session-10.scope - Session 10 of User core.
Jan 13 21:08:11.754306 sshd[6145]: Connection closed by 147.75.109.163 port 45226
Jan 13 21:08:11.754709 sshd-session[6142]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:11.757062 systemd[1]: sshd@7-139.178.70.107:22-147.75.109.163:45226.service: Deactivated successfully.
Jan 13 21:08:11.759041 systemd[1]: session-10.scope: Deactivated successfully.
Jan 13 21:08:11.759253 systemd-logind[1629]: Session 10 logged out. Waiting for processes to exit.
Jan 13 21:08:11.760016 systemd-logind[1629]: Removed session 10.
Jan 13 21:08:16.763471 systemd[1]: Started sshd@8-139.178.70.107:22-147.75.109.163:45242.service - OpenSSH per-connection server daemon (147.75.109.163:45242).
Jan 13 21:08:16.797991 sshd[6167]: Accepted publickey for core from 147.75.109.163 port 45242 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:16.799249 sshd-session[6167]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:16.801939 systemd-logind[1629]: New session 11 of user core.
Jan 13 21:08:16.805030 systemd[1]: Started session-11.scope - Session 11 of User core.
Jan 13 21:08:17.181449 sshd[6170]: Connection closed by 147.75.109.163 port 45242
Jan 13 21:08:17.200113 sshd-session[6167]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:17.219346 systemd[1]: sshd@8-139.178.70.107:22-147.75.109.163:45242.service: Deactivated successfully.
Jan 13 21:08:17.220748 systemd[1]: session-11.scope: Deactivated successfully.
Jan 13 21:08:17.220764 systemd-logind[1629]: Session 11 logged out. Waiting for processes to exit.
Jan 13 21:08:17.221599 systemd-logind[1629]: Removed session 11.
Jan 13 21:08:21.343647 kubelet[3044]: I0113 21:08:21.343419 3044 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 13 21:08:22.188951 systemd[1]: Started sshd@9-139.178.70.107:22-147.75.109.163:52842.service - OpenSSH per-connection server daemon (147.75.109.163:52842).
Jan 13 21:08:22.495899 sshd[6185]: Accepted publickey for core from 147.75.109.163 port 52842 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:22.503084 sshd-session[6185]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:22.516921 systemd-logind[1629]: New session 12 of user core.
Jan 13 21:08:22.520014 systemd[1]: Started session-12.scope - Session 12 of User core.
Jan 13 21:08:22.778322 sshd[6188]: Connection closed by 147.75.109.163 port 52842
Jan 13 21:08:22.778905 sshd-session[6185]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:22.781203 systemd[1]: sshd@9-139.178.70.107:22-147.75.109.163:52842.service: Deactivated successfully.
Jan 13 21:08:22.783252 systemd[1]: session-12.scope: Deactivated successfully.
Jan 13 21:08:22.783597 systemd-logind[1629]: Session 12 logged out. Waiting for processes to exit.
Jan 13 21:08:22.784253 systemd-logind[1629]: Removed session 12.
Jan 13 21:08:23.939052 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:23.939058 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:23.940822 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:27.787006 systemd[1]: Started sshd@10-139.178.70.107:22-147.75.109.163:56418.service - OpenSSH per-connection server daemon (147.75.109.163:56418).
Jan 13 21:08:28.066871 sshd[6218]: Accepted publickey for core from 147.75.109.163 port 56418 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:28.068043 sshd-session[6218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:28.072118 systemd-logind[1629]: New session 13 of user core.
Jan 13 21:08:28.077119 systemd[1]: Started session-13.scope - Session 13 of User core.
Jan 13 21:08:28.189694 sshd[6221]: Connection closed by 147.75.109.163 port 56418
Jan 13 21:08:28.192546 sshd-session[6218]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:28.198023 systemd[1]: Started sshd@11-139.178.70.107:22-147.75.109.163:56430.service - OpenSSH per-connection server daemon (147.75.109.163:56430).
Jan 13 21:08:28.198320 systemd[1]: sshd@10-139.178.70.107:22-147.75.109.163:56418.service: Deactivated successfully.
Jan 13 21:08:28.203531 systemd-logind[1629]: Session 13 logged out. Waiting for processes to exit.
Jan 13 21:08:28.203565 systemd[1]: session-13.scope: Deactivated successfully.
Jan 13 21:08:28.205652 systemd-logind[1629]: Removed session 13.
Jan 13 21:08:28.248209 sshd[6230]: Accepted publickey for core from 147.75.109.163 port 56430 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:28.251342 sshd-session[6230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:28.254818 systemd-logind[1629]: New session 14 of user core.
Jan 13 21:08:28.260945 systemd[1]: Started session-14.scope - Session 14 of User core.
Jan 13 21:08:28.419742 sshd[6236]: Connection closed by 147.75.109.163 port 56430
Jan 13 21:08:28.419993 sshd-session[6230]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:28.426005 systemd[1]: Started sshd@12-139.178.70.107:22-147.75.109.163:56444.service - OpenSSH per-connection server daemon (147.75.109.163:56444).
Jan 13 21:08:28.426262 systemd[1]: sshd@11-139.178.70.107:22-147.75.109.163:56430.service: Deactivated successfully.
Jan 13 21:08:28.428538 systemd-logind[1629]: Session 14 logged out. Waiting for processes to exit.
Jan 13 21:08:28.429086 systemd[1]: session-14.scope: Deactivated successfully.
Jan 13 21:08:28.430430 systemd-logind[1629]: Removed session 14.
Jan 13 21:08:28.590024 sshd[6242]: Accepted publickey for core from 147.75.109.163 port 56444 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:28.599919 sshd-session[6242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:28.603593 systemd-logind[1629]: New session 15 of user core.
Jan 13 21:08:28.609039 systemd[1]: Started session-15.scope - Session 15 of User core.
Jan 13 21:08:28.768464 sshd[6248]: Connection closed by 147.75.109.163 port 56444
Jan 13 21:08:28.768768 sshd-session[6242]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:28.770900 systemd[1]: sshd@12-139.178.70.107:22-147.75.109.163:56444.service: Deactivated successfully.
Jan 13 21:08:28.773139 systemd[1]: session-15.scope: Deactivated successfully.
Jan 13 21:08:28.773378 systemd-logind[1629]: Session 15 logged out. Waiting for processes to exit.
Jan 13 21:08:28.773970 systemd-logind[1629]: Removed session 15.
Jan 13 21:08:31.939846 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:31.939224 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:31.939231 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:33.774573 systemd[1]: Started sshd@13-139.178.70.107:22-147.75.109.163:56448.service - OpenSSH per-connection server daemon (147.75.109.163:56448).
Jan 13 21:08:33.802758 sshd[6263]: Accepted publickey for core from 147.75.109.163 port 56448 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:33.803513 sshd-session[6263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:33.806074 systemd-logind[1629]: New session 16 of user core.
Jan 13 21:08:33.808942 systemd[1]: Started session-16.scope - Session 16 of User core.
Jan 13 21:08:33.921112 sshd[6266]: Connection closed by 147.75.109.163 port 56448
Jan 13 21:08:33.924087 sshd-session[6263]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:33.928935 systemd-logind[1629]: Session 16 logged out. Waiting for processes to exit.
Jan 13 21:08:33.929280 systemd[1]: sshd@13-139.178.70.107:22-147.75.109.163:56448.service: Deactivated successfully.
Jan 13 21:08:33.931305 systemd[1]: session-16.scope: Deactivated successfully.
Jan 13 21:08:33.932118 systemd-logind[1629]: Removed session 16.
Jan 13 21:08:38.928961 systemd[1]: Started sshd@14-139.178.70.107:22-147.75.109.163:41240.service - OpenSSH per-connection server daemon (147.75.109.163:41240).
Jan 13 21:08:39.050926 sshd[6279]: Accepted publickey for core from 147.75.109.163 port 41240 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:39.052671 sshd-session[6279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:39.055781 systemd-logind[1629]: New session 17 of user core.
Jan 13 21:08:39.060065 systemd[1]: Started session-17.scope - Session 17 of User core.
Jan 13 21:08:39.180021 sshd[6282]: Connection closed by 147.75.109.163 port 41240
Jan 13 21:08:39.182225 sshd-session[6279]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:39.183991 systemd[1]: Started sshd@15-139.178.70.107:22-147.75.109.163:41246.service - OpenSSH per-connection server daemon (147.75.109.163:41246).
Jan 13 21:08:39.185035 systemd[1]: sshd@14-139.178.70.107:22-147.75.109.163:41240.service: Deactivated successfully.
Jan 13 21:08:39.187756 systemd[1]: session-17.scope: Deactivated successfully.
Jan 13 21:08:39.188685 systemd-logind[1629]: Session 17 logged out. Waiting for processes to exit.
Jan 13 21:08:39.192386 systemd-logind[1629]: Removed session 17.
Jan 13 21:08:39.219559 sshd[6290]: Accepted publickey for core from 147.75.109.163 port 41246 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:39.220326 sshd-session[6290]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:39.222899 systemd-logind[1629]: New session 18 of user core.
Jan 13 21:08:39.226939 systemd[1]: Started session-18.scope - Session 18 of User core.
Jan 13 21:08:39.885421 sshd[6296]: Connection closed by 147.75.109.163 port 41246
Jan 13 21:08:39.890962 systemd[1]: Started sshd@16-139.178.70.107:22-147.75.109.163:41248.service - OpenSSH per-connection server daemon (147.75.109.163:41248).
Jan 13 21:08:39.909149 sshd-session[6290]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:39.915787 systemd[1]: sshd@15-139.178.70.107:22-147.75.109.163:41246.service: Deactivated successfully.
Jan 13 21:08:39.915982 systemd-logind[1629]: Session 18 logged out. Waiting for processes to exit.
Jan 13 21:08:39.918194 systemd[1]: session-18.scope: Deactivated successfully.
Jan 13 21:08:39.919327 systemd-logind[1629]: Removed session 18.
Jan 13 21:08:39.951820 sshd[6326]: Accepted publickey for core from 147.75.109.163 port 41248 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:39.954237 sshd-session[6326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:39.957515 systemd-logind[1629]: New session 19 of user core.
Jan 13 21:08:39.962991 systemd[1]: Started session-19.scope - Session 19 of User core.
Jan 13 21:08:40.003206 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:40.004296 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:40.003210 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:41.847851 sshd[6332]: Connection closed by 147.75.109.163 port 41248
Jan 13 21:08:41.853962 systemd[1]: Started sshd@17-139.178.70.107:22-147.75.109.163:41256.service - OpenSSH per-connection server daemon (147.75.109.163:41256).
Jan 13 21:08:41.855641 sshd-session[6326]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:41.876129 systemd[1]: sshd@16-139.178.70.107:22-147.75.109.163:41248.service: Deactivated successfully.
Jan 13 21:08:41.880405 systemd[1]: session-19.scope: Deactivated successfully.
Jan 13 21:08:41.880787 systemd-logind[1629]: Session 19 logged out. Waiting for processes to exit.
Jan 13 21:08:41.881893 systemd-logind[1629]: Removed session 19.
Jan 13 21:08:41.936016 sshd[6345]: Accepted publickey for core from 147.75.109.163 port 41256 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:41.937083 sshd-session[6345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:41.940583 systemd-logind[1629]: New session 20 of user core.
Jan 13 21:08:41.946997 systemd[1]: Started session-20.scope - Session 20 of User core.
Jan 13 21:08:42.051879 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:42.053373 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:42.051884 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:42.516380 sshd[6365]: Connection closed by 147.75.109.163 port 41256
Jan 13 21:08:42.516929 sshd-session[6345]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:42.521972 systemd[1]: Started sshd@18-139.178.70.107:22-147.75.109.163:41272.service - OpenSSH per-connection server daemon (147.75.109.163:41272).
Jan 13 21:08:42.522210 systemd[1]: sshd@17-139.178.70.107:22-147.75.109.163:41256.service: Deactivated successfully.
Jan 13 21:08:42.525752 systemd[1]: session-20.scope: Deactivated successfully.
Jan 13 21:08:42.527788 systemd-logind[1629]: Session 20 logged out. Waiting for processes to exit.
Jan 13 21:08:42.530926 systemd-logind[1629]: Removed session 20.
Jan 13 21:08:42.558480 sshd[6371]: Accepted publickey for core from 147.75.109.163 port 41272 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:42.559732 sshd-session[6371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:42.562699 systemd-logind[1629]: New session 21 of user core.
Jan 13 21:08:42.573127 systemd[1]: Started session-21.scope - Session 21 of User core.
Jan 13 21:08:42.668685 sshd[6377]: Connection closed by 147.75.109.163 port 41272
Jan 13 21:08:42.669071 sshd-session[6371]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:42.671322 systemd[1]: sshd@18-139.178.70.107:22-147.75.109.163:41272.service: Deactivated successfully.
Jan 13 21:08:42.673127 systemd[1]: session-21.scope: Deactivated successfully.
Jan 13 21:08:42.673749 systemd-logind[1629]: Session 21 logged out. Waiting for processes to exit.
Jan 13 21:08:42.674877 systemd-logind[1629]: Removed session 21.
Jan 13 21:08:44.098893 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:44.099873 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:44.098898 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:47.675968 systemd[1]: Started sshd@19-139.178.70.107:22-147.75.109.163:60590.service - OpenSSH per-connection server daemon (147.75.109.163:60590).
Jan 13 21:08:47.743838 sshd[6391]: Accepted publickey for core from 147.75.109.163 port 60590 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:47.745674 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:47.748588 systemd-logind[1629]: New session 22 of user core.
Jan 13 21:08:47.752966 systemd[1]: Started session-22.scope - Session 22 of User core.
Jan 13 21:08:48.032533 sshd[6394]: Connection closed by 147.75.109.163 port 60590
Jan 13 21:08:48.032935 sshd-session[6391]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:48.035319 systemd[1]: sshd@19-139.178.70.107:22-147.75.109.163:60590.service: Deactivated successfully.
Jan 13 21:08:48.037131 systemd[1]: session-22.scope: Deactivated successfully.
Jan 13 21:08:48.037361 systemd-logind[1629]: Session 22 logged out. Waiting for processes to exit.
Jan 13 21:08:48.038395 systemd-logind[1629]: Removed session 22.
Jan 13 21:08:51.971886 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:51.972995 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:51.971892 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:53.039899 systemd[1]: Started sshd@20-139.178.70.107:22-147.75.109.163:60596.service - OpenSSH per-connection server daemon (147.75.109.163:60596).
Jan 13 21:08:53.069315 sshd[6415]: Accepted publickey for core from 147.75.109.163 port 60596 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:53.070170 sshd-session[6415]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:53.072991 systemd-logind[1629]: New session 23 of user core.
Jan 13 21:08:53.083308 systemd[1]: Started session-23.scope - Session 23 of User core.
Jan 13 21:08:53.191696 sshd[6418]: Connection closed by 147.75.109.163 port 60596
Jan 13 21:08:53.194201 systemd[1]: sshd@20-139.178.70.107:22-147.75.109.163:60596.service: Deactivated successfully.
Jan 13 21:08:53.192151 sshd-session[6415]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:53.195606 systemd-logind[1629]: Session 23 logged out. Waiting for processes to exit.
Jan 13 21:08:53.196225 systemd[1]: session-23.scope: Deactivated successfully.
Jan 13 21:08:53.197403 systemd-logind[1629]: Removed session 23.
Jan 13 21:08:53.836991 systemd[1]: run-containerd-runc-k8s.io-c7d0470dd27681e4847b32541b050efb353a95d5f8e59fdb90d2fb3ee4db09c0-runc.yDB5km.mount: Deactivated successfully.
Jan 13 21:08:54.018888 systemd-resolved[1547]: Under memory pressure, flushing caches.
Jan 13 21:08:54.020018 systemd-journald[1197]: Under memory pressure, flushing caches.
Jan 13 21:08:54.018893 systemd-resolved[1547]: Flushed all caches.
Jan 13 21:08:58.198010 systemd[1]: Started sshd@21-139.178.70.107:22-147.75.109.163:46670.service - OpenSSH per-connection server daemon (147.75.109.163:46670).
Jan 13 21:08:58.239560 sshd[6447]: Accepted publickey for core from 147.75.109.163 port 46670 ssh2: RSA SHA256:WEckMlOLIdfNCxC+bjbAx0q0QvEi5FKedQv7GWCkIpw
Jan 13 21:08:58.240465 sshd-session[6447]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jan 13 21:08:58.243877 systemd-logind[1629]: New session 24 of user core.
Jan 13 21:08:58.246994 systemd[1]: Started session-24.scope - Session 24 of User core.
Jan 13 21:08:58.438905 sshd[6450]: Connection closed by 147.75.109.163 port 46670
Jan 13 21:08:58.439278 sshd-session[6447]: pam_unix(sshd:session): session closed for user core
Jan 13 21:08:58.441660 systemd[1]: sshd@21-139.178.70.107:22-147.75.109.163:46670.service: Deactivated successfully.
Jan 13 21:08:58.443677 systemd-logind[1629]: Session 24 logged out. Waiting for processes to exit.
Jan 13 21:08:58.444083 systemd[1]: session-24.scope: Deactivated successfully.
Jan 13 21:08:58.445381 systemd-logind[1629]: Removed session 24.